Apr 21 01:50:02.288217 ip-10-0-129-52 systemd[1]: Starting Kubernetes Kubelet...
Apr 21 01:50:02.797551 ip-10-0-129-52 kubenswrapper[2573]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 01:50:02.797551 ip-10-0-129-52 kubenswrapper[2573]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 21 01:50:02.797551 ip-10-0-129-52 kubenswrapper[2573]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 01:50:02.797551 ip-10-0-129-52 kubenswrapper[2573]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 21 01:50:02.797551 ip-10-0-129-52 kubenswrapper[2573]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
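The deprecation warnings above all point at the kubelet's `--config` file. A minimal sketch of the equivalent `KubeletConfiguration` fields follows; the field names are the upstream ones, but every value here is a placeholder, since this node's actual settings are not shown in the log:

```yaml
# Hypothetical KubeletConfiguration fragment (e.g. /etc/kubernetes/kubelet.conf)
# replacing the deprecated CLI flags named in the warnings. Values are examples.
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
# replaces --container-runtime-endpoint
containerRuntimeEndpoint: unix:///var/run/crio/crio.sock
# replaces --volume-plugin-dir
volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec
# replaces --system-reserved
systemReserved:
  cpu: 500m
  memory: 1Gi
# --minimum-container-ttl-duration is superseded by eviction settings
evictionHard:
  memory.available: 100Mi
```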
Apr 21 01:50:02.799474 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.799372    2573 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 21 01:50:02.803498 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.803480    2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 01:50:02.803498 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.803498    2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 01:50:02.803569 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.803502    2573 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 01:50:02.803569 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.803506    2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 01:50:02.803569 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.803509    2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 01:50:02.803569 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.803512    2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 01:50:02.803569 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.803515    2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 01:50:02.803569 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.803518    2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 01:50:02.803569 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.803520    2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 01:50:02.803569 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.803523    2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 01:50:02.803569 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.803526    2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 01:50:02.803569 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.803528    2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 01:50:02.803569 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.803531    2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 01:50:02.803569 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.803534    2573 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 01:50:02.803569 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.803536    2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 01:50:02.803569 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.803539    2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 01:50:02.803569 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.803541    2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 01:50:02.803569 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.803544    2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 01:50:02.803569 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.803546    2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 01:50:02.803569 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.803549    2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 01:50:02.803569 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.803552    2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 01:50:02.803569 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.803555    2573 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 01:50:02.804070 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.803561    2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 01:50:02.804070 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.803564    2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 01:50:02.804070 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.803566    2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 01:50:02.804070 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.803569    2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 01:50:02.804070 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.803572    2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 01:50:02.804070 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.803576    2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 01:50:02.804070 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.803578    2573 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 01:50:02.804070 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.803581    2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 01:50:02.804070 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.803586    2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 01:50:02.804070 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.803589    2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 01:50:02.804070 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.803592    2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 01:50:02.804070 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.803594    2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 01:50:02.804070 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.803597    2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 01:50:02.804070 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.803601    2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 01:50:02.804070 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.803604    2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 01:50:02.804070 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.803606    2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 01:50:02.804070 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.803609    2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 01:50:02.804070 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.803611    2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 01:50:02.804070 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.803614    2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 01:50:02.804614 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.803616    2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 01:50:02.804614 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.803619    2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 01:50:02.804614 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.803622    2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 01:50:02.804614 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.803624    2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 01:50:02.804614 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.803627    2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 01:50:02.804614 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.803629    2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 01:50:02.804614 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.803632    2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 01:50:02.804614 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.803634    2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 01:50:02.804614 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.803637    2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 01:50:02.804614 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.803639    2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 01:50:02.804614 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.803642    2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 01:50:02.804614 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.803645    2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 01:50:02.804614 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.803647    2573 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 01:50:02.804614 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.803650    2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 01:50:02.804614 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.803654    2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 01:50:02.804614 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.803657    2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 01:50:02.804614 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.803659    2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 01:50:02.804614 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.803662    2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 01:50:02.804614 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.803666    2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 01:50:02.805108 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.803670    2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 01:50:02.805108 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.803673    2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 01:50:02.805108 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.803676    2573 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 01:50:02.805108 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.803678    2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 01:50:02.805108 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.803681    2573 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 01:50:02.805108 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.803683    2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 01:50:02.805108 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.803686    2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 01:50:02.805108 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.803688    2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 01:50:02.805108 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.803691    2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 01:50:02.805108 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.803693    2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 01:50:02.805108 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.803696    2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 01:50:02.805108 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.803698    2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 01:50:02.805108 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.803701    2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 01:50:02.805108 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.803703    2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 01:50:02.805108 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.803706    2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 01:50:02.805108 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.803710    2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 01:50:02.805108 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.803713    2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 01:50:02.805108 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.803716    2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 01:50:02.805108 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.803719    2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 01:50:02.805108 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.803722    2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 01:50:02.805594 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.803724    2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 01:50:02.805594 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.803727    2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 01:50:02.805594 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.803729    2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 01:50:02.805594 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.803732    2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 01:50:02.805594 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.803734    2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 01:50:02.805594 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.803737    2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 01:50:02.805594 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.804166    2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 01:50:02.805594 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.804172    2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 01:50:02.805594 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.804174    2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 01:50:02.805594 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.804178    2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 01:50:02.805594 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.804180    2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 01:50:02.805594 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.804183    2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 01:50:02.805594 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.804186    2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 01:50:02.805594 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.804189    2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 01:50:02.805594 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.804192    2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 01:50:02.805594 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.804195    2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 01:50:02.805594 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.804197    2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 01:50:02.805594 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.804200    2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 01:50:02.805594 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.804202    2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 01:50:02.805594 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.804205    2573 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 01:50:02.806146 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.804207    2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 01:50:02.806146 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.804210    2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 01:50:02.806146 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.804212    2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 01:50:02.806146 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.804215    2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 01:50:02.806146 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.804218    2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 01:50:02.806146 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.804221    2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 01:50:02.806146 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.804223    2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 01:50:02.806146 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.804226    2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 01:50:02.806146 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.804229    2573 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 01:50:02.806146 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.804232    2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 01:50:02.806146 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.804234    2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 01:50:02.806146 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.804237    2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 01:50:02.806146 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.804239    2573 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 01:50:02.806146 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.804242    2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 01:50:02.806146 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.804244    2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 01:50:02.806146 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.804247    2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 01:50:02.806146 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.804249    2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 01:50:02.806146 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.804251    2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 01:50:02.806146 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.804254    2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 01:50:02.806621 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.804257    2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 01:50:02.806621 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.804260    2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 01:50:02.806621 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.804264    2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 01:50:02.806621 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.804267    2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 01:50:02.806621 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.804269    2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 01:50:02.806621 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.804272    2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 01:50:02.806621 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.804274    2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 01:50:02.806621 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.804277    2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 01:50:02.806621 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.804280    2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 01:50:02.806621 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.804282    2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 01:50:02.806621 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.804285    2573 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 01:50:02.806621 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.804294    2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 01:50:02.806621 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.804297    2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 01:50:02.806621 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.804300    2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 01:50:02.806621 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.804302    2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 01:50:02.806621 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.804305    2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 01:50:02.806621 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.804307    2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 01:50:02.806621 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.804309    2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 01:50:02.806621 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.804312    2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 01:50:02.806621 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.804314    2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 01:50:02.807143 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.804317    2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 01:50:02.807143 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.804320    2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 01:50:02.807143 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.804323    2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 01:50:02.807143 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.804325    2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 01:50:02.807143 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.804328    2573 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 01:50:02.807143 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.804333    2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 01:50:02.807143 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.804336    2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 01:50:02.807143 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.804339    2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 01:50:02.807143 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.804342    2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 01:50:02.807143 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.804345    2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 01:50:02.807143 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.804348    2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 01:50:02.807143 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.804351    2573 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 01:50:02.807143 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.804355    2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 01:50:02.807143 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.804358    2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 01:50:02.807143 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.804361    2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 01:50:02.807143 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.804364    2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 01:50:02.807143 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.804367    2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 01:50:02.807143 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.804369    2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 01:50:02.807143 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.804372    2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 01:50:02.807606 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.804375    2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 01:50:02.807606 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.804378    2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 01:50:02.807606 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.804380    2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 01:50:02.807606 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.804383    2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 01:50:02.807606 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.804386    2573 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 01:50:02.807606 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.804388    2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 01:50:02.807606 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.804391    2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 01:50:02.807606 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.804393    2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 01:50:02.807606 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.804396    2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 01:50:02.807606 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.804398    2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 01:50:02.807606 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.804401    2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 01:50:02.807606 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.804403    2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 01:50:02.807606 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.804406    2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 01:50:02.807606 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.804408    2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 01:50:02.807606 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805254    2573 flags.go:64] FLAG: --address="0.0.0.0"
Apr 21 01:50:02.807606 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805268    2573 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 21 01:50:02.807606 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805275    2573 flags.go:64] FLAG: --anonymous-auth="true"
Apr 21 01:50:02.807606 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805280    2573 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 21 01:50:02.807606 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805285    2573 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 21 01:50:02.807606 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805288    2573 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 21 01:50:02.807606 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805293    2573 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 21 01:50:02.808127 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805298    2573 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 21 01:50:02.808127 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805301    2573 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 21 01:50:02.808127 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805304    2573 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 21 01:50:02.808127 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805308    2573 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 21 01:50:02.808127 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805314    2573 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 21 01:50:02.808127 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805317    2573 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 21 01:50:02.808127 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805321    2573 flags.go:64] FLAG: --cgroup-root=""
Apr 21 01:50:02.808127 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805324    2573 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 21 01:50:02.808127 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805327    2573 flags.go:64] FLAG: --client-ca-file=""
Apr 21 01:50:02.808127 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805329    2573 flags.go:64] FLAG: --cloud-config=""
Apr 21 01:50:02.808127 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805332    2573 flags.go:64] FLAG: --cloud-provider="external"
Apr 21 01:50:02.808127 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805335    2573 flags.go:64] FLAG: --cluster-dns="[]"
Apr 21 01:50:02.808127 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805341    2573 flags.go:64] FLAG: --cluster-domain=""
Apr 21 01:50:02.808127 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805344    2573 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 21 01:50:02.808127 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805347    2573 flags.go:64] FLAG: --config-dir=""
Apr 21 01:50:02.808127 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805350    2573 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 21 01:50:02.808127 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805353    2573 flags.go:64] FLAG: --container-log-max-files="5"
Apr 21 01:50:02.808127 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805357    2573 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 21 01:50:02.808127 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805361    2573 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 21 01:50:02.808127 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805364    2573 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 21 01:50:02.808127 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805368    2573 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 21 01:50:02.808127 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805371    2573 flags.go:64] FLAG: --contention-profiling="false"
Apr 21 01:50:02.808127 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805373    2573 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 21 01:50:02.808127 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805376    2573 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 21 01:50:02.808730 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805380    2573 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 21 01:50:02.808730 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805383    2573 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 21 01:50:02.808730 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805388    2573 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 21 01:50:02.808730 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805391    2573 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 21 01:50:02.808730 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805395    2573 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 21 01:50:02.808730 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805397    2573 flags.go:64] FLAG: --enable-load-reader="false"
Apr 21 01:50:02.808730 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805401    2573 flags.go:64] FLAG: --enable-server="true"
Apr 21 01:50:02.808730 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805404    2573 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 21 01:50:02.808730 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805408    2573 flags.go:64] FLAG: --event-burst="100"
Apr 21 01:50:02.808730 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805412    2573 flags.go:64] FLAG: --event-qps="50"
Apr 21 01:50:02.808730 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805415 2573 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 21 01:50:02.808730 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805418 2573 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 21 01:50:02.808730 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805421 2573 flags.go:64] FLAG: --eviction-hard=""
Apr 21 01:50:02.808730 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805425 2573 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 21 01:50:02.808730 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805428 2573 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 21 01:50:02.808730 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805431 2573 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 21 01:50:02.808730 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805434 2573 flags.go:64] FLAG: --eviction-soft=""
Apr 21 01:50:02.808730 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805437 2573 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 21 01:50:02.808730 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805440 2573 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 21 01:50:02.808730 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805443 2573 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 21 01:50:02.808730 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805446 2573 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 21 01:50:02.808730 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805449 2573 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 21 01:50:02.808730 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805452 2573 flags.go:64] FLAG: --fail-swap-on="true"
Apr 21 01:50:02.808730 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805455 2573 flags.go:64] FLAG: --feature-gates=""
Apr 21 01:50:02.808730 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805459 2573 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 21 01:50:02.809364 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805462 2573 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 21 01:50:02.809364 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805465 2573 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 21 01:50:02.809364 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805468 2573 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 21 01:50:02.809364 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805472 2573 flags.go:64] FLAG: --healthz-port="10248"
Apr 21 01:50:02.809364 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805475 2573 flags.go:64] FLAG: --help="false"
Apr 21 01:50:02.809364 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805478 2573 flags.go:64] FLAG: --hostname-override="ip-10-0-129-52.ec2.internal"
Apr 21 01:50:02.809364 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805481 2573 flags.go:64] FLAG: --housekeeping-interval="10s"
Apr 21 01:50:02.809364 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805484 2573 flags.go:64] FLAG: --http-check-frequency="20s"
Apr 21 01:50:02.809364 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805487 2573 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins"
Apr 21 01:50:02.809364 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805491 2573 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml"
Apr 21 01:50:02.809364 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805494 2573 flags.go:64] FLAG: --image-gc-high-threshold="85"
Apr 21 01:50:02.809364 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805502 2573 flags.go:64] FLAG: --image-gc-low-threshold="80"
Apr 21 01:50:02.809364 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805505 2573 flags.go:64] FLAG: --image-service-endpoint=""
Apr 21 01:50:02.809364 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805509 2573 flags.go:64] FLAG: --kernel-memcg-notification="false"
Apr 21 01:50:02.809364 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805511 2573 flags.go:64] FLAG: --kube-api-burst="100"
Apr 21 01:50:02.809364 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805515 2573 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Apr 21 01:50:02.809364 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805518 2573 flags.go:64] FLAG: --kube-api-qps="50"
Apr 21 01:50:02.809364 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805521 2573 flags.go:64] FLAG: --kube-reserved=""
Apr 21 01:50:02.809364 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805524 2573 flags.go:64] FLAG: --kube-reserved-cgroup=""
Apr 21 01:50:02.809364 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805527 2573 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Apr 21 01:50:02.809364 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805530 2573 flags.go:64] FLAG: --kubelet-cgroups=""
Apr 21 01:50:02.809364 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805532 2573 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Apr 21 01:50:02.809364 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805535 2573 flags.go:64] FLAG: --lock-file=""
Apr 21 01:50:02.809364 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805538 2573 flags.go:64] FLAG: --log-cadvisor-usage="false"
Apr 21 01:50:02.809958 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805541 2573 flags.go:64] FLAG: --log-flush-frequency="5s"
Apr 21 01:50:02.809958 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805544 2573 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Apr 21 01:50:02.809958 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805549 2573 flags.go:64] FLAG: --log-json-split-stream="false"
Apr 21 01:50:02.809958 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805552 2573 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Apr 21 01:50:02.809958 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805555 2573 flags.go:64] FLAG: --log-text-split-stream="false"
Apr 21 01:50:02.809958 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805558 2573 flags.go:64] FLAG: --logging-format="text"
Apr 21 01:50:02.809958 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805561 2573 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Apr 21 01:50:02.809958 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805565 2573 flags.go:64] FLAG: --make-iptables-util-chains="true"
Apr 21 01:50:02.809958 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805568 2573 flags.go:64] FLAG: --manifest-url=""
Apr 21 01:50:02.809958 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805570 2573 flags.go:64] FLAG: --manifest-url-header=""
Apr 21 01:50:02.809958 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805575 2573 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Apr 21 01:50:02.809958 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805578 2573 flags.go:64] FLAG: --max-open-files="1000000"
Apr 21 01:50:02.809958 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805583 2573 flags.go:64] FLAG: --max-pods="110"
Apr 21 01:50:02.809958 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805586 2573 flags.go:64] FLAG: --maximum-dead-containers="-1"
Apr 21 01:50:02.809958 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805589 2573 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Apr 21 01:50:02.809958 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805592 2573 flags.go:64] FLAG: --memory-manager-policy="None"
Apr 21 01:50:02.809958 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805594 2573 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Apr 21 01:50:02.809958 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805598 2573 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Apr 21 01:50:02.809958 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805601 2573 flags.go:64] FLAG: --node-ip="0.0.0.0"
Apr 21 01:50:02.809958 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805604 2573 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel"
Apr 21 01:50:02.809958 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805614 2573 flags.go:64] FLAG: --node-status-max-images="50"
Apr 21 01:50:02.809958 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805617 2573 flags.go:64] FLAG: --node-status-update-frequency="10s"
Apr 21 01:50:02.809958 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805620 2573 flags.go:64] FLAG: --oom-score-adj="-999"
Apr 21 01:50:02.809958 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805623 2573 flags.go:64] FLAG: --pod-cidr=""
Apr 21 01:50:02.810541 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805626 2573 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715"
Apr 21 01:50:02.810541 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805632 2573 flags.go:64] FLAG: --pod-manifest-path=""
Apr 21 01:50:02.810541 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805635 2573 flags.go:64] FLAG: --pod-max-pids="-1"
Apr 21 01:50:02.810541 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805638 2573 flags.go:64] FLAG: --pods-per-core="0"
Apr 21 01:50:02.810541 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805641 2573 flags.go:64] FLAG: --port="10250"
Apr 21 01:50:02.810541 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805644 2573 flags.go:64] FLAG: --protect-kernel-defaults="false"
Apr 21 01:50:02.810541 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805647 2573 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-00f85cd5aba4fd209"
Apr 21 01:50:02.810541 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805650 2573 flags.go:64] FLAG: --qos-reserved=""
Apr 21 01:50:02.810541 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805653 2573 flags.go:64] FLAG: --read-only-port="10255"
Apr 21 01:50:02.810541 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805656 2573 flags.go:64] FLAG: --register-node="true"
Apr 21 01:50:02.810541 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805659 2573 flags.go:64] FLAG: --register-schedulable="true"
Apr 21 01:50:02.810541 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805662 2573 flags.go:64] FLAG: --register-with-taints=""
Apr 21 01:50:02.810541 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805666 2573 flags.go:64] FLAG: --registry-burst="10"
Apr 21 01:50:02.810541 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805669 2573 flags.go:64] FLAG: --registry-qps="5"
Apr 21 01:50:02.810541 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805672 2573 flags.go:64] FLAG: --reserved-cpus=""
Apr 21 01:50:02.810541 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805674 2573 flags.go:64] FLAG: --reserved-memory=""
Apr 21 01:50:02.810541 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805679 2573 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Apr 21 01:50:02.810541 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805682 2573 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Apr 21 01:50:02.810541 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805685 2573 flags.go:64] FLAG: --rotate-certificates="false"
Apr 21 01:50:02.810541 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805688 2573 flags.go:64] FLAG: --rotate-server-certificates="false"
Apr 21 01:50:02.810541 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805691 2573 flags.go:64] FLAG: --runonce="false"
Apr 21 01:50:02.810541 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805694 2573 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Apr 21 01:50:02.810541 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805697 2573 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Apr 21 01:50:02.810541 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805701 2573 flags.go:64] FLAG: --seccomp-default="false"
Apr 21 01:50:02.810541 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805704 2573 flags.go:64] FLAG: --serialize-image-pulls="true"
Apr 21 01:50:02.811186 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805707 2573 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Apr 21 01:50:02.811186 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805710 2573 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Apr 21 01:50:02.811186 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805713 2573 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Apr 21 01:50:02.811186 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805717 2573 flags.go:64] FLAG: --storage-driver-password="root"
Apr 21 01:50:02.811186 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805720 2573 flags.go:64] FLAG: --storage-driver-secure="false"
Apr 21 01:50:02.811186 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805723 2573 flags.go:64] FLAG: --storage-driver-table="stats"
Apr 21 01:50:02.811186 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805726 2573 flags.go:64] FLAG: --storage-driver-user="root"
Apr 21 01:50:02.811186 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805729 2573 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Apr 21 01:50:02.811186 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805732 2573 flags.go:64] FLAG: --sync-frequency="1m0s"
Apr 21 01:50:02.811186 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805734 2573 flags.go:64] FLAG: --system-cgroups=""
Apr 21 01:50:02.811186 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805737 2573 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Apr 21 01:50:02.811186 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805743 2573 flags.go:64] FLAG: --system-reserved-cgroup=""
Apr 21 01:50:02.811186 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805746 2573 flags.go:64] FLAG: --tls-cert-file=""
Apr 21 01:50:02.811186 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805749 2573 flags.go:64] FLAG: --tls-cipher-suites="[]"
Apr 21 01:50:02.811186 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805753 2573 flags.go:64] FLAG: --tls-min-version=""
Apr 21 01:50:02.811186 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805756 2573 flags.go:64] FLAG: --tls-private-key-file=""
Apr 21 01:50:02.811186 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805759 2573 flags.go:64] FLAG: --topology-manager-policy="none"
Apr 21 01:50:02.811186 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805762 2573 flags.go:64] FLAG: --topology-manager-policy-options=""
Apr 21 01:50:02.811186 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805765 2573 flags.go:64] FLAG: --topology-manager-scope="container"
Apr 21 01:50:02.811186 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805768 2573 flags.go:64] FLAG: --v="2"
Apr 21 01:50:02.811186 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805772 2573 flags.go:64] FLAG: --version="false"
Apr 21 01:50:02.811186 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805776 2573 flags.go:64] FLAG: --vmodule=""
Apr 21 01:50:02.811186 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805781 2573 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Apr 21 01:50:02.811186 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.805785 2573 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Apr 21 01:50:02.811186 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.805907 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 01:50:02.811830 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.805912 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 01:50:02.811830 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.805916 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 01:50:02.811830 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.805919 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 01:50:02.811830 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.805922 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 01:50:02.811830 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.805925 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 01:50:02.811830 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.805928 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 01:50:02.811830 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.805931 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 01:50:02.811830 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.805933 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 01:50:02.811830 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.805936 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 01:50:02.811830 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.805939 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 01:50:02.811830 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.805942 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 01:50:02.811830 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.805944 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 01:50:02.811830 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.805948 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 01:50:02.811830 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.805950 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 01:50:02.811830 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.805953 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 01:50:02.811830 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.805956 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 01:50:02.811830 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.805958 2573 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 01:50:02.811830 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.805961 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 01:50:02.811830 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.805963 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 01:50:02.812291 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.805966 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 01:50:02.812291 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.805969 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 01:50:02.812291 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.805972 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 01:50:02.812291 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.805974 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 01:50:02.812291 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.805976 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 01:50:02.812291 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.805982 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 01:50:02.812291 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.805985 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 01:50:02.812291 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.805987 2573 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 01:50:02.812291 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.805990 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 01:50:02.812291 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.805992 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 01:50:02.812291 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.805995 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 01:50:02.812291 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.805998 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 01:50:02.812291 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.806001 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 01:50:02.812291 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.806003 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 01:50:02.812291 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.806006 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 01:50:02.812291 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.806008 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 01:50:02.812291 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.806011 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 01:50:02.812291 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.806013 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 01:50:02.812291 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.806016 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 01:50:02.812291 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.806018 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 01:50:02.812838 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.806020 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 01:50:02.812838 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.806023 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 01:50:02.812838 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.806026 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 01:50:02.812838 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.806028 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 01:50:02.812838 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.806031 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 01:50:02.812838 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.806034 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 01:50:02.812838 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.806037 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 01:50:02.812838 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.806039 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 01:50:02.812838 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.806042 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 01:50:02.812838 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.806045 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 01:50:02.812838 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.806047 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 01:50:02.812838 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.806051 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 01:50:02.812838 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.806055 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 01:50:02.812838 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.806057 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 01:50:02.812838 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.806060 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 01:50:02.812838 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.806063 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 01:50:02.812838 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.806065 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 01:50:02.812838 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.806069 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 01:50:02.812838 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.806072 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 01:50:02.813321 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.806074 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 01:50:02.813321 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.806077 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 01:50:02.813321 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.806080 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 01:50:02.813321 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.806082 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 01:50:02.813321 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.806085 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 01:50:02.813321 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.806088 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 01:50:02.813321 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.806090 2573 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 01:50:02.813321 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.806093 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 01:50:02.813321 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.806095 2573 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 01:50:02.813321 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.806098 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 01:50:02.813321 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.806100 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 01:50:02.813321 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.806102 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 01:50:02.813321 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.806105 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 01:50:02.813321 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.806107 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 01:50:02.813321 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.806110 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 01:50:02.813321 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.806112 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 01:50:02.813321 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.806115 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 01:50:02.813321 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.806117 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 01:50:02.813321 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.806120 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 01:50:02.813321 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.806123 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 01:50:02.813861 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.806126 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 01:50:02.813861 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.806128 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 01:50:02.813861 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.806131 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 01:50:02.813861 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.806133 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 01:50:02.813861 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.806136 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 01:50:02.813861 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.806138 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 01:50:02.813861 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.806142 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 01:50:02.813861 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.807081 2573 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 21 01:50:02.814314 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.814291 2573 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 21 01:50:02.814347 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.814315 2573 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 21 01:50:02.814389 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814380 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 01:50:02.814389 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814389 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 01:50:02.814445 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814392 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 01:50:02.814445 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814395 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 01:50:02.814445 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814399 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 01:50:02.814445 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814401 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 01:50:02.814445 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814404 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 01:50:02.814445 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814407 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 01:50:02.814445 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814409 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 01:50:02.814445 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814412 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 01:50:02.814445 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814414 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 01:50:02.814445 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814418 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 01:50:02.814445 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814420 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 01:50:02.814445 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814423 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 01:50:02.814445 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814425 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 01:50:02.814445 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814428 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 01:50:02.814445 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814430 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 01:50:02.814445 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814433 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 01:50:02.814445 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814436 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 01:50:02.814445 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814439 2573 feature_gate.go:328] unrecognized feature gate:
MachineAPIOperatorDisableMachineHealthCheckController Apr 21 01:50:02.814445 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814442 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 21 01:50:02.814445 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814444 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 21 01:50:02.814948 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814448 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 21 01:50:02.814948 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814450 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 21 01:50:02.814948 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814453 2573 feature_gate.go:328] unrecognized feature gate: Example Apr 21 01:50:02.814948 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814456 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 21 01:50:02.814948 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814459 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 21 01:50:02.814948 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814461 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 21 01:50:02.814948 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814464 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 21 01:50:02.814948 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814467 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 21 01:50:02.814948 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814469 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 21 01:50:02.814948 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814473 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 21 01:50:02.814948 ip-10-0-129-52 kubenswrapper[2573]: W0421 
01:50:02.814476 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 21 01:50:02.814948 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814478 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 21 01:50:02.814948 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814481 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 21 01:50:02.814948 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814484 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 21 01:50:02.814948 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814486 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 21 01:50:02.814948 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814490 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 21 01:50:02.814948 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814494 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 21 01:50:02.814948 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814497 2573 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 21 01:50:02.814948 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814500 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 21 01:50:02.814948 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814502 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 21 01:50:02.815470 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814504 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 21 01:50:02.815470 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814507 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 21 01:50:02.815470 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814509 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 21 
01:50:02.815470 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814512 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 21 01:50:02.815470 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814514 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 21 01:50:02.815470 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814517 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 21 01:50:02.815470 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814519 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 21 01:50:02.815470 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814522 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 21 01:50:02.815470 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814524 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 21 01:50:02.815470 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814527 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 21 01:50:02.815470 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814529 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 21 01:50:02.815470 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814532 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 21 01:50:02.815470 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814535 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 21 01:50:02.815470 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814538 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 21 01:50:02.815470 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814541 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 21 01:50:02.815470 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814544 2573 feature_gate.go:328] unrecognized feature 
gate: BootcNodeManagement Apr 21 01:50:02.815470 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814547 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 21 01:50:02.815470 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814549 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 21 01:50:02.815470 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814552 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 21 01:50:02.815470 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814554 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 21 01:50:02.815989 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814557 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 21 01:50:02.815989 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814559 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 21 01:50:02.815989 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814563 2573 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 21 01:50:02.815989 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814566 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 21 01:50:02.815989 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814568 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 21 01:50:02.815989 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814571 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 21 01:50:02.815989 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814573 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 21 01:50:02.815989 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814576 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 21 01:50:02.815989 ip-10-0-129-52 
kubenswrapper[2573]: W0421 01:50:02.814578 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 21 01:50:02.815989 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814581 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 21 01:50:02.815989 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814583 2573 feature_gate.go:328] unrecognized feature gate: Example2 Apr 21 01:50:02.815989 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814586 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 21 01:50:02.815989 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814588 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 21 01:50:02.815989 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814591 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 21 01:50:02.815989 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814593 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 21 01:50:02.815989 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814596 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 21 01:50:02.815989 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814598 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 21 01:50:02.815989 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814601 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 21 01:50:02.815989 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814603 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 21 01:50:02.816462 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814607 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 21 01:50:02.816462 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814612 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 21 01:50:02.816462 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814615 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 21 01:50:02.816462 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814618 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 21 01:50:02.816462 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814621 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 21 01:50:02.816462 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.814626 2573 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 21 01:50:02.816462 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814790 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 21 01:50:02.816462 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814797 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 21 01:50:02.816462 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814800 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 21 01:50:02.816462 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814803 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 21 01:50:02.816462 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814806 2573 feature_gate.go:328] unrecognized feature gate: Example Apr 21 
01:50:02.816462 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814823 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 21 01:50:02.816462 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814827 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 21 01:50:02.816462 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814830 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 21 01:50:02.816462 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814832 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 21 01:50:02.816854 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814835 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 21 01:50:02.816854 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814838 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 21 01:50:02.816854 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814841 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 21 01:50:02.816854 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814844 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 21 01:50:02.816854 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814847 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 21 01:50:02.816854 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814849 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 21 01:50:02.816854 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814852 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 21 01:50:02.816854 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814854 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 21 01:50:02.816854 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814857 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC 
Apr 21 01:50:02.816854 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814860 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 21 01:50:02.816854 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814862 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 21 01:50:02.816854 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814865 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 21 01:50:02.816854 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814868 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 21 01:50:02.816854 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814870 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 21 01:50:02.816854 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814872 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 21 01:50:02.816854 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814875 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 21 01:50:02.816854 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814877 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 21 01:50:02.816854 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814880 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 21 01:50:02.816854 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814883 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 21 01:50:02.816854 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814885 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 21 01:50:02.817347 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814888 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 21 01:50:02.817347 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814890 2573 feature_gate.go:328] 
unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 21 01:50:02.817347 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814893 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 21 01:50:02.817347 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814895 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 21 01:50:02.817347 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814899 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 21 01:50:02.817347 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814901 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 21 01:50:02.817347 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814904 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 21 01:50:02.817347 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814907 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 21 01:50:02.817347 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814909 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 21 01:50:02.817347 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814912 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 21 01:50:02.817347 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814915 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 21 01:50:02.817347 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814917 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 21 01:50:02.817347 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814920 2573 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 21 01:50:02.817347 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814922 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 21 01:50:02.817347 
ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814931 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 21 01:50:02.817347 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814933 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 21 01:50:02.817347 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814936 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 21 01:50:02.817347 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814938 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 21 01:50:02.817347 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814941 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 21 01:50:02.817347 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814944 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 21 01:50:02.817846 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814946 2573 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 21 01:50:02.817846 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814949 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 21 01:50:02.817846 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814954 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 21 01:50:02.817846 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814958 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 21 01:50:02.817846 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814961 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 21 01:50:02.817846 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814964 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 21 01:50:02.817846 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814967 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 21 01:50:02.817846 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814970 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 21 01:50:02.817846 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814972 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 21 01:50:02.817846 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814974 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 21 01:50:02.817846 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814977 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 21 01:50:02.817846 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814979 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 21 01:50:02.817846 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814982 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 21 01:50:02.817846 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814984 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 21 01:50:02.817846 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814987 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 21 01:50:02.817846 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814990 2573 feature_gate.go:328] unrecognized feature 
gate: PreconfiguredUDNAddresses Apr 21 01:50:02.817846 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814993 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 21 01:50:02.817846 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814996 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 21 01:50:02.817846 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.814998 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 21 01:50:02.818317 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.815001 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 21 01:50:02.818317 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.815004 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 21 01:50:02.818317 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.815006 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 21 01:50:02.818317 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.815009 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 21 01:50:02.818317 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.815011 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 21 01:50:02.818317 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.815014 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 21 01:50:02.818317 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.815016 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 21 01:50:02.818317 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.815019 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 21 01:50:02.818317 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.815022 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 21 01:50:02.818317 ip-10-0-129-52 kubenswrapper[2573]: 
W0421 01:50:02.815025 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 21 01:50:02.818317 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.815028 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 21 01:50:02.818317 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.815032 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 21 01:50:02.818317 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.815035 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 21 01:50:02.818317 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.815037 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 21 01:50:02.818317 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.815039 2573 feature_gate.go:328] unrecognized feature gate: Example2 Apr 21 01:50:02.818317 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.815042 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 21 01:50:02.818317 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.815044 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 21 01:50:02.818317 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:02.815047 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 21 01:50:02.818766 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.815051 2573 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 21 01:50:02.818766 
ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.815927 2573 server.go:962] "Client rotation is on, will bootstrap in background" Apr 21 01:50:02.818766 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.818053 2573 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 21 01:50:02.819272 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.819259 2573 server.go:1019] "Starting client certificate rotation" Apr 21 01:50:02.819384 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.819364 2573 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 21 01:50:02.819423 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.819413 2573 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 21 01:50:02.848909 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.848883 2573 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 21 01:50:02.852179 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.852143 2573 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 21 01:50:02.868972 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.868945 2573 log.go:25] "Validated CRI v1 runtime API" Apr 21 01:50:02.876663 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.876638 2573 log.go:25] "Validated CRI v1 image API" Apr 21 01:50:02.877997 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.877964 2573 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 21 01:50:02.878103 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.878037 2573 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 21 
01:50:02.884395 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.884364 2573 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 7c572479-cbe7-46d1-b165-33e59562b15e:/dev/nvme0n1p4 d0e4f275-d7ea-498a-86cd-154383b3f0c5:/dev/nvme0n1p3] Apr 21 01:50:02.884395 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.884391 2573 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 21 01:50:02.890550 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.890411 2573 manager.go:217] Machine: {Timestamp:2026-04-21 01:50:02.888339976 +0000 UTC m=+0.467642024 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3100109 MemoryCapacity:33164488704 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2679690dd7360d863e391a4bdd5d8e SystemUUID:ec267969-0dd7-360d-863e-391a4bdd5d8e BootID:7ede13ad-a031-4b6f-a137-03ec98a335e0 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582242304 Type:vfs Inodes:4048399 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 
DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:ee:59:35:55:89 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:ee:59:35:55:89 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:32:36:40:e8:db:0e Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164488704 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 21 01:50:02.890550 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.890536 2573 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Apr 21 01:50:02.890694 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.890633 2573 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 21 01:50:02.892920 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.892884 2573 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 21 01:50:02.893077 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.892924 2573 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-129-52.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 21 01:50:02.893125 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.893083 2573 topology_manager.go:138] "Creating topology manager with none policy"
Apr 21 01:50:02.893125 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.893093 2573 container_manager_linux.go:306] "Creating device plugin manager"
Apr 21 01:50:02.893125 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.893107 2573 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 21 01:50:02.894696 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.894683 2573 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 21 01:50:02.896331 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.896318 2573 state_mem.go:36] "Initialized new in-memory state store"
Apr 21 01:50:02.896473 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.896463 2573 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 21 01:50:02.897059 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.897038 2573 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-2wcw4"
Apr 21 01:50:02.899996 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.899983 2573 kubelet.go:491] "Attempting to sync node with API server"
Apr 21 01:50:02.900060 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.900002 2573 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 21 01:50:02.900060 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.900016 2573 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 21 01:50:02.900060 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.900029 2573 kubelet.go:397] "Adding apiserver pod source"
Apr 21 01:50:02.900060 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.900038 2573 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 21 01:50:02.901376 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.901358 2573 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 21 01:50:02.901376 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.901378 2573 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 21 01:50:02.902285 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.902265 2573 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-2wcw4"
Apr 21 01:50:02.905061 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.905041 2573 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 21 01:50:02.906993 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.906979 2573 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 21 01:50:02.908688 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.908677 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 21 01:50:02.908730 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.908700 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 21 01:50:02.908730 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.908707 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 21 01:50:02.908730 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.908713 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 21 01:50:02.908730 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.908718 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 21 01:50:02.908730 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.908725 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 21 01:50:02.908730 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.908731 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 21 01:50:02.908912 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.908737 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 21 01:50:02.908912 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.908745 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 21 01:50:02.908912 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.908751 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 21 01:50:02.908912 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.908764 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 21 01:50:02.908912 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.908773 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 21 01:50:02.909839 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.909827 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 21 01:50:02.909872 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.909843 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 21 01:50:02.914464 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.914447 2573 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 21 01:50:02.914576 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.914499 2573 server.go:1295] "Started kubelet"
Apr 21 01:50:02.914656 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.914592 2573 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 21 01:50:02.914795 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.914738 2573 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 21 01:50:02.914856 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.914826 2573 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 21 01:50:02.915291 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.915265 2573 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 21 01:50:02.915500 ip-10-0-129-52 systemd[1]: Started Kubernetes Kubelet.
Apr 21 01:50:02.916896 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.916879 2573 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 21 01:50:02.917970 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.917955 2573 server.go:317] "Adding debug handlers to kubelet server"
Apr 21 01:50:02.918747 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.918720 2573 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 21 01:50:02.922852 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.922832 2573 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-129-52.ec2.internal" not found
Apr 21 01:50:02.923435 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.923420 2573 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 21 01:50:02.923494 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.923432 2573 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 21 01:50:02.924364 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:50:02.924345 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-52.ec2.internal\" not found"
Apr 21 01:50:02.924439 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.924427 2573 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 21 01:50:02.924484 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.924442 2573 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 21 01:50:02.924524 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.924430 2573 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 21 01:50:02.924572 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.924559 2573 reconstruct.go:97] "Volume reconstruction finished"
Apr 21 01:50:02.924630 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.924573 2573 reconciler.go:26] "Reconciler: start to sync state"
Apr 21 01:50:02.925438 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.925416 2573 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 21 01:50:02.925945 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.925925 2573 factory.go:55] Registering systemd factory
Apr 21 01:50:02.925945 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.925950 2573 factory.go:223] Registration of the systemd container factory successfully
Apr 21 01:50:02.926248 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.926229 2573 factory.go:153] Registering CRI-O factory
Apr 21 01:50:02.926248 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.926250 2573 factory.go:223] Registration of the crio container factory successfully
Apr 21 01:50:02.926393 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.926328 2573 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 21 01:50:02.926393 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.926344 2573 factory.go:103] Registering Raw factory
Apr 21 01:50:02.926393 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.926359 2573 manager.go:1196] Started watching for new ooms in manager
Apr 21 01:50:02.926772 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.926759 2573 manager.go:319] Starting recovery of all containers
Apr 21 01:50:02.929491 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:50:02.929264 2573 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-129-52.ec2.internal\" not found" node="ip-10-0-129-52.ec2.internal"
Apr 21 01:50:02.929570 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:50:02.929552 2573 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 21 01:50:02.932548 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.932517 2573 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 21 01:50:02.938172 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.938144 2573 manager.go:324] Recovery completed
Apr 21 01:50:02.940285 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:50:02.940264 2573 watcher.go:152] Failed to watch directory "/sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service": inotify_add_watch /sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service: no such file or directory
Apr 21 01:50:02.940752 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.940739 2573 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-129-52.ec2.internal" not found
Apr 21 01:50:02.943572 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.943559 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 21 01:50:02.945706 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.945689 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-52.ec2.internal" event="NodeHasSufficientMemory"
Apr 21 01:50:02.945794 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.945723 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-52.ec2.internal" event="NodeHasNoDiskPressure"
Apr 21 01:50:02.945794 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.945735 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-52.ec2.internal" event="NodeHasSufficientPID"
Apr 21 01:50:02.946257 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.946244 2573 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 21 01:50:02.946308 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.946257 2573 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 21 01:50:02.946308 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.946276 2573 state_mem.go:36] "Initialized new in-memory state store"
Apr 21 01:50:02.948851 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.948808 2573 policy_none.go:49] "None policy: Start"
Apr 21 01:50:02.948851 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.948838 2573 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 21 01:50:02.948851 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.948849 2573 state_mem.go:35] "Initializing new in-memory state store"
Apr 21 01:50:03.000634 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.989665 2573 manager.go:341] "Starting Device Plugin manager"
Apr 21 01:50:03.000634 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:50:02.989704 2573 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 21 01:50:03.000634 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.989713 2573 server.go:85] "Starting device plugin registration server"
Apr 21 01:50:03.000634 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.989960 2573 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 21 01:50:03.000634 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.989970 2573 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 21 01:50:03.000634 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.990079 2573 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 21 01:50:03.000634 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.990165 2573 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 21 01:50:03.000634 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.990176 2573 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 21 01:50:03.000634 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:50:02.990690 2573 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 21 01:50:03.000634 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:50:02.990725 2573 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-129-52.ec2.internal\" not found"
Apr 21 01:50:03.000634 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:02.997544 2573 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-129-52.ec2.internal" not found
Apr 21 01:50:03.058501 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.058416 2573 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 21 01:50:03.058501 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.058452 2573 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 21 01:50:03.058501 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.058470 2573 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 21 01:50:03.058501 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.058477 2573 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 21 01:50:03.058749 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:50:03.058510 2573 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 21 01:50:03.060653 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.060630 2573 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 21 01:50:03.090502 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.090465 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 21 01:50:03.091676 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.091657 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-52.ec2.internal" event="NodeHasSufficientMemory"
Apr 21 01:50:03.091759 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.091689 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-52.ec2.internal" event="NodeHasNoDiskPressure"
Apr 21 01:50:03.091759 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.091700 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-52.ec2.internal" event="NodeHasSufficientPID"
Apr 21 01:50:03.091759 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.091725 2573 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-129-52.ec2.internal"
Apr 21 01:50:03.100588 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.100569 2573 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-129-52.ec2.internal"
Apr 21 01:50:03.100667 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:50:03.100597 2573 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-129-52.ec2.internal\": node \"ip-10-0-129-52.ec2.internal\" not found"
Apr 21 01:50:03.159144 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.159105 2573 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-52.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-129-52.ec2.internal"]
Apr 21 01:50:03.161712 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.161691 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-52.ec2.internal"
Apr 21 01:50:03.161795 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.161691 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-52.ec2.internal"
Apr 21 01:50:03.185510 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.185480 2573 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-52.ec2.internal"
Apr 21 01:50:03.190125 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.190108 2573 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-52.ec2.internal"
Apr 21 01:50:03.210056 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.210030 2573 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 21 01:50:03.212901 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.212885 2573 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 21 01:50:03.225899 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.225877 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/bc7f96eee01b7ac864ac98f5fb6e45b4-config\") pod \"kube-apiserver-proxy-ip-10-0-129-52.ec2.internal\" (UID: \"bc7f96eee01b7ac864ac98f5fb6e45b4\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-52.ec2.internal"
Apr 21 01:50:03.225999 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.225909 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/053154a49e4e985b6b9ca90f37194bef-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-52.ec2.internal\" (UID: \"053154a49e4e985b6b9ca90f37194bef\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-52.ec2.internal"
Apr 21 01:50:03.225999 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.225936 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/053154a49e4e985b6b9ca90f37194bef-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-52.ec2.internal\" (UID: \"053154a49e4e985b6b9ca90f37194bef\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-52.ec2.internal"
Apr 21 01:50:03.326155 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.326080 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/053154a49e4e985b6b9ca90f37194bef-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-52.ec2.internal\" (UID: \"053154a49e4e985b6b9ca90f37194bef\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-52.ec2.internal"
Apr 21 01:50:03.326155 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.326110 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/053154a49e4e985b6b9ca90f37194bef-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-52.ec2.internal\" (UID: \"053154a49e4e985b6b9ca90f37194bef\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-52.ec2.internal"
Apr 21 01:50:03.326155 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.326131 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/bc7f96eee01b7ac864ac98f5fb6e45b4-config\") pod \"kube-apiserver-proxy-ip-10-0-129-52.ec2.internal\" (UID: \"bc7f96eee01b7ac864ac98f5fb6e45b4\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-52.ec2.internal"
Apr 21 01:50:03.326347 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.326189 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/053154a49e4e985b6b9ca90f37194bef-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-52.ec2.internal\" (UID: \"053154a49e4e985b6b9ca90f37194bef\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-52.ec2.internal"
Apr 21 01:50:03.326347 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.326208 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/bc7f96eee01b7ac864ac98f5fb6e45b4-config\") pod \"kube-apiserver-proxy-ip-10-0-129-52.ec2.internal\" (UID: \"bc7f96eee01b7ac864ac98f5fb6e45b4\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-52.ec2.internal"
Apr 21 01:50:03.326347 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.326182 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/053154a49e4e985b6b9ca90f37194bef-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-52.ec2.internal\" (UID: \"053154a49e4e985b6b9ca90f37194bef\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-52.ec2.internal"
Apr 21 01:50:03.512657 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.512619 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-52.ec2.internal"
Apr 21 01:50:03.516421 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.516399 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-52.ec2.internal"
Apr 21 01:50:03.818696 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.818600 2573 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 21 01:50:03.819209 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.818782 2573 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 21 01:50:03.819209 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.818785 2573 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 21 01:50:03.819209 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.818785 2573 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 21 01:50:03.900562 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.900530 2573 apiserver.go:52] "Watching apiserver"
Apr 21 01:50:03.904170 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.904130 2573 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-20 01:45:02 +0000 UTC" deadline="2027-12-29 18:25:58.380174946 +0000 UTC"
Apr 21 01:50:03.904170 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.904167 2573 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14824h35m54.476010416s"
Apr 21 01:50:03.907966 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.907940 2573 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 21 01:50:03.909034 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.909011 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-operator/iptables-alerter-2wlcr","openshift-ovn-kubernetes/ovnkube-node-f2tdx","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cd5f6","openshift-cluster-node-tuning-operator/tuned-l9kzx","openshift-image-registry/node-ca-xfl2x","openshift-multus/multus-additional-cni-plugins-8h5m5","openshift-multus/multus-fdckn","openshift-network-diagnostics/network-check-target-bvqj7","kube-system/konnectivity-agent-l94wt","kube-system/kube-apiserver-proxy-ip-10-0-129-52.ec2.internal","openshift-dns/node-resolver-w4c4r","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-52.ec2.internal","openshift-multus/network-metrics-daemon-mfs4c"]
Apr 21 01:50:03.910566 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.910549 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-2wlcr"
Apr 21 01:50:03.912317 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.912294 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-f2tdx"
Apr 21 01:50:03.914212 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.914191 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cd5f6"
Apr 21 01:50:03.914646 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.914629 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 21 01:50:03.914864 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.914849 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 21 01:50:03.914979 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.914963 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 21 01:50:03.914979 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.914974 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 21 01:50:03.915062 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.914975 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 21 01:50:03.915102 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.915063 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-nhnc9\""
Apr 21 01:50:03.915537 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.915523 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 21 01:50:03.916352 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.916018 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-l9kzx"
Apr 21 01:50:03.917506 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.917092 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 21 01:50:03.917506 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.917333 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 21 01:50:03.917659 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.917641 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 21 01:50:03.917772 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.917750 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 21 01:50:03.917968 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.917643 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-nc6t6\""
Apr 21 01:50:03.918201 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.918023 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 21 01:50:03.918349 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.918145 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-h2t9f\""
Apr 21 01:50:03.918445 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.918176 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 21 01:50:03.918445 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.918263 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 21 01:50:03.918445 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.918410 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 21 01:50:03.918445 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.918420 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-fqxf6\""
Apr 21 01:50:03.919160 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.919143 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-xfl2x"
Apr 21 01:50:03.919314 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.919296 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-8h5m5"
Apr 21 01:50:03.920679 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.920663 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-fdckn"
Apr 21 01:50:03.921172 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.921157 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 21 01:50:03.921350 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.921335 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-gnmgg\""
Apr 21 01:50:03.921397 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.921380 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 21 01:50:03.921592 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.921569 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 21 01:50:03.921672 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.921620 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 21 01:50:03.922082 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.922067 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bvqj7" Apr 21 01:50:03.922170 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.922090 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 21 01:50:03.922170 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.922106 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 21 01:50:03.922170 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.922120 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 21 01:50:03.922170 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:50:03.922150 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-bvqj7" podUID="560db137-e262-4c6c-9380-c422a8537e5e" Apr 21 01:50:03.922340 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.922243 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 21 01:50:03.922375 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.922363 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-zbrlm\"" Apr 21 01:50:03.922717 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.922704 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 21 01:50:03.922962 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.922947 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-f7p8j\"" Apr 21 01:50:03.923544 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.923526 2573 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 21 01:50:03.923614 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.923567 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-l94wt" Apr 21 01:50:03.924897 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.924882 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-w4c4r" Apr 21 01:50:03.925660 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.925646 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 21 01:50:03.925737 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.925665 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-rqt2m\"" Apr 21 01:50:03.925847 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.925797 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 21 01:50:03.926604 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.926586 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mfs4c" Apr 21 01:50:03.926664 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:50:03.926639 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mfs4c" podUID="9c103689-40cc-470b-9109-33a63ff6f5dd" Apr 21 01:50:03.926952 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.926933 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 21 01:50:03.927032 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.926938 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 21 01:50:03.927032 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.926961 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-4gxfw\"" Apr 21 01:50:03.927624 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.927607 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/214bcf67-a154-4c72-a914-d9efa8bdfee9-host-var-lib-cni-multus\") pod \"multus-fdckn\" (UID: \"214bcf67-a154-4c72-a914-d9efa8bdfee9\") " pod="openshift-multus/multus-fdckn" Apr 21 01:50:03.927710 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.927633 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/214bcf67-a154-4c72-a914-d9efa8bdfee9-host-var-lib-kubelet\") pod \"multus-fdckn\" (UID: \"214bcf67-a154-4c72-a914-d9efa8bdfee9\") " pod="openshift-multus/multus-fdckn" Apr 21 01:50:03.927710 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.927650 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ed377958-ce5b-41c7-9512-4b95b799767d-host-slash\") pod \"ovnkube-node-f2tdx\" (UID: \"ed377958-ce5b-41c7-9512-4b95b799767d\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2tdx" Apr 21 01:50:03.927710 
ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.927664 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f7e3c291-0094-4e7c-9d88-7f0864c79c8d-kubelet-dir\") pod \"aws-ebs-csi-driver-node-cd5f6\" (UID: \"f7e3c291-0094-4e7c-9d88-7f0864c79c8d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cd5f6" Apr 21 01:50:03.927710 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.927678 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/4d1027fb-9f71-4cf4-b0db-1f2916e50320-etc-sysctl-conf\") pod \"tuned-l9kzx\" (UID: \"4d1027fb-9f71-4cf4-b0db-1f2916e50320\") " pod="openshift-cluster-node-tuning-operator/tuned-l9kzx" Apr 21 01:50:03.927710 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.927693 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ed377958-ce5b-41c7-9512-4b95b799767d-log-socket\") pod \"ovnkube-node-f2tdx\" (UID: \"ed377958-ce5b-41c7-9512-4b95b799767d\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2tdx" Apr 21 01:50:03.927967 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.927729 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ed377958-ce5b-41c7-9512-4b95b799767d-ovn-node-metrics-cert\") pod \"ovnkube-node-f2tdx\" (UID: \"ed377958-ce5b-41c7-9512-4b95b799767d\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2tdx" Apr 21 01:50:03.927967 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.927766 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/214bcf67-a154-4c72-a914-d9efa8bdfee9-host-run-netns\") pod 
\"multus-fdckn\" (UID: \"214bcf67-a154-4c72-a914-d9efa8bdfee9\") " pod="openshift-multus/multus-fdckn" Apr 21 01:50:03.927967 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.927830 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mm6p6\" (UniqueName: \"kubernetes.io/projected/560db137-e262-4c6c-9380-c422a8537e5e-kube-api-access-mm6p6\") pod \"network-check-target-bvqj7\" (UID: \"560db137-e262-4c6c-9380-c422a8537e5e\") " pod="openshift-network-diagnostics/network-check-target-bvqj7" Apr 21 01:50:03.927967 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.927869 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8bvh\" (UniqueName: \"kubernetes.io/projected/6de78603-c646-45a5-8bc4-9cfc56456d0f-kube-api-access-v8bvh\") pod \"node-ca-xfl2x\" (UID: \"6de78603-c646-45a5-8bc4-9cfc56456d0f\") " pod="openshift-image-registry/node-ca-xfl2x" Apr 21 01:50:03.927967 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.927909 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/214bcf67-a154-4c72-a914-d9efa8bdfee9-cnibin\") pod \"multus-fdckn\" (UID: \"214bcf67-a154-4c72-a914-d9efa8bdfee9\") " pod="openshift-multus/multus-fdckn" Apr 21 01:50:03.927967 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.927934 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/214bcf67-a154-4c72-a914-d9efa8bdfee9-host-var-lib-cni-bin\") pod \"multus-fdckn\" (UID: \"214bcf67-a154-4c72-a914-d9efa8bdfee9\") " pod="openshift-multus/multus-fdckn" Apr 21 01:50:03.927967 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.927960 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/214bcf67-a154-4c72-a914-d9efa8bdfee9-etc-kubernetes\") pod \"multus-fdckn\" (UID: \"214bcf67-a154-4c72-a914-d9efa8bdfee9\") " pod="openshift-multus/multus-fdckn" Apr 21 01:50:03.928298 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.927982 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ed377958-ce5b-41c7-9512-4b95b799767d-systemd-units\") pod \"ovnkube-node-f2tdx\" (UID: \"ed377958-ce5b-41c7-9512-4b95b799767d\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2tdx" Apr 21 01:50:03.928298 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.928026 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/4d1027fb-9f71-4cf4-b0db-1f2916e50320-run\") pod \"tuned-l9kzx\" (UID: \"4d1027fb-9f71-4cf4-b0db-1f2916e50320\") " pod="openshift-cluster-node-tuning-operator/tuned-l9kzx" Apr 21 01:50:03.928298 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.928050 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7b9f858d-0cf6-49d9-8632-23d2d0584e24-system-cni-dir\") pod \"multus-additional-cni-plugins-8h5m5\" (UID: \"7b9f858d-0cf6-49d9-8632-23d2d0584e24\") " pod="openshift-multus/multus-additional-cni-plugins-8h5m5" Apr 21 01:50:03.928298 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.928074 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mptvn\" (UniqueName: \"kubernetes.io/projected/7b9f858d-0cf6-49d9-8632-23d2d0584e24-kube-api-access-mptvn\") pod \"multus-additional-cni-plugins-8h5m5\" (UID: \"7b9f858d-0cf6-49d9-8632-23d2d0584e24\") " pod="openshift-multus/multus-additional-cni-plugins-8h5m5" Apr 21 01:50:03.928298 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.928099 2573 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ed377958-ce5b-41c7-9512-4b95b799767d-host-run-netns\") pod \"ovnkube-node-f2tdx\" (UID: \"ed377958-ce5b-41c7-9512-4b95b799767d\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2tdx" Apr 21 01:50:03.928298 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.928125 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4d1027fb-9f71-4cf4-b0db-1f2916e50320-lib-modules\") pod \"tuned-l9kzx\" (UID: \"4d1027fb-9f71-4cf4-b0db-1f2916e50320\") " pod="openshift-cluster-node-tuning-operator/tuned-l9kzx" Apr 21 01:50:03.928298 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.928151 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2vrm\" (UniqueName: \"kubernetes.io/projected/4d1027fb-9f71-4cf4-b0db-1f2916e50320-kube-api-access-c2vrm\") pod \"tuned-l9kzx\" (UID: \"4d1027fb-9f71-4cf4-b0db-1f2916e50320\") " pod="openshift-cluster-node-tuning-operator/tuned-l9kzx" Apr 21 01:50:03.928298 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.928192 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7b9f858d-0cf6-49d9-8632-23d2d0584e24-tuning-conf-dir\") pod \"multus-additional-cni-plugins-8h5m5\" (UID: \"7b9f858d-0cf6-49d9-8632-23d2d0584e24\") " pod="openshift-multus/multus-additional-cni-plugins-8h5m5" Apr 21 01:50:03.928298 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.928231 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ed377958-ce5b-41c7-9512-4b95b799767d-run-ovn\") pod \"ovnkube-node-f2tdx\" (UID: \"ed377958-ce5b-41c7-9512-4b95b799767d\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-f2tdx" Apr 21 01:50:03.928298 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.928253 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7d5b\" (UniqueName: \"kubernetes.io/projected/ed377958-ce5b-41c7-9512-4b95b799767d-kube-api-access-z7d5b\") pod \"ovnkube-node-f2tdx\" (UID: \"ed377958-ce5b-41c7-9512-4b95b799767d\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2tdx" Apr 21 01:50:03.928298 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.928276 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/4d1027fb-9f71-4cf4-b0db-1f2916e50320-etc-modprobe-d\") pod \"tuned-l9kzx\" (UID: \"4d1027fb-9f71-4cf4-b0db-1f2916e50320\") " pod="openshift-cluster-node-tuning-operator/tuned-l9kzx" Apr 21 01:50:03.928734 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.928310 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/6de78603-c646-45a5-8bc4-9cfc56456d0f-serviceca\") pod \"node-ca-xfl2x\" (UID: \"6de78603-c646-45a5-8bc4-9cfc56456d0f\") " pod="openshift-image-registry/node-ca-xfl2x" Apr 21 01:50:03.928734 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.928333 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/214bcf67-a154-4c72-a914-d9efa8bdfee9-multus-cni-dir\") pod \"multus-fdckn\" (UID: \"214bcf67-a154-4c72-a914-d9efa8bdfee9\") " pod="openshift-multus/multus-fdckn" Apr 21 01:50:03.928734 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.928357 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: 
\"kubernetes.io/configmap/214bcf67-a154-4c72-a914-d9efa8bdfee9-multus-daemon-config\") pod \"multus-fdckn\" (UID: \"214bcf67-a154-4c72-a914-d9efa8bdfee9\") " pod="openshift-multus/multus-fdckn" Apr 21 01:50:03.928734 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.928382 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ed377958-ce5b-41c7-9512-4b95b799767d-etc-openvswitch\") pod \"ovnkube-node-f2tdx\" (UID: \"ed377958-ce5b-41c7-9512-4b95b799767d\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2tdx" Apr 21 01:50:03.928734 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.928404 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ed377958-ce5b-41c7-9512-4b95b799767d-env-overrides\") pod \"ovnkube-node-f2tdx\" (UID: \"ed377958-ce5b-41c7-9512-4b95b799767d\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2tdx" Apr 21 01:50:03.928734 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.928439 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/f7e3c291-0094-4e7c-9d88-7f0864c79c8d-sys-fs\") pod \"aws-ebs-csi-driver-node-cd5f6\" (UID: \"f7e3c291-0094-4e7c-9d88-7f0864c79c8d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cd5f6" Apr 21 01:50:03.928734 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.928463 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ed377958-ce5b-41c7-9512-4b95b799767d-ovnkube-config\") pod \"ovnkube-node-f2tdx\" (UID: \"ed377958-ce5b-41c7-9512-4b95b799767d\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2tdx" Apr 21 01:50:03.928734 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.928486 2573 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49fp5\" (UniqueName: \"kubernetes.io/projected/cc11073d-4e8d-4b5b-b1bd-40ad61aa03b0-kube-api-access-49fp5\") pod \"node-resolver-w4c4r\" (UID: \"cc11073d-4e8d-4b5b-b1bd-40ad61aa03b0\") " pod="openshift-dns/node-resolver-w4c4r" Apr 21 01:50:03.928734 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.928509 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6de78603-c646-45a5-8bc4-9cfc56456d0f-host\") pod \"node-ca-xfl2x\" (UID: \"6de78603-c646-45a5-8bc4-9cfc56456d0f\") " pod="openshift-image-registry/node-ca-xfl2x" Apr 21 01:50:03.928734 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.928531 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7b9f858d-0cf6-49d9-8632-23d2d0584e24-cni-binary-copy\") pod \"multus-additional-cni-plugins-8h5m5\" (UID: \"7b9f858d-0cf6-49d9-8632-23d2d0584e24\") " pod="openshift-multus/multus-additional-cni-plugins-8h5m5" Apr 21 01:50:03.928734 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.928556 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/cacbe46e-f843-4ec6-b2e4-222e2ab51feb-host-slash\") pod \"iptables-alerter-2wlcr\" (UID: \"cacbe46e-f843-4ec6-b2e4-222e2ab51feb\") " pod="openshift-network-operator/iptables-alerter-2wlcr" Apr 21 01:50:03.928734 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.928599 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ed377958-ce5b-41c7-9512-4b95b799767d-node-log\") pod \"ovnkube-node-f2tdx\" (UID: \"ed377958-ce5b-41c7-9512-4b95b799767d\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-f2tdx" Apr 21 01:50:03.928734 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.928633 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f7e3c291-0094-4e7c-9d88-7f0864c79c8d-registration-dir\") pod \"aws-ebs-csi-driver-node-cd5f6\" (UID: \"f7e3c291-0094-4e7c-9d88-7f0864c79c8d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cd5f6" Apr 21 01:50:03.928734 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.928662 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7b9f858d-0cf6-49d9-8632-23d2d0584e24-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-8h5m5\" (UID: \"7b9f858d-0cf6-49d9-8632-23d2d0584e24\") " pod="openshift-multus/multus-additional-cni-plugins-8h5m5" Apr 21 01:50:03.928734 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.928685 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ed377958-ce5b-41c7-9512-4b95b799767d-host-kubelet\") pod \"ovnkube-node-f2tdx\" (UID: \"ed377958-ce5b-41c7-9512-4b95b799767d\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2tdx" Apr 21 01:50:03.928734 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.928726 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/f7e3c291-0094-4e7c-9d88-7f0864c79c8d-etc-selinux\") pod \"aws-ebs-csi-driver-node-cd5f6\" (UID: \"f7e3c291-0094-4e7c-9d88-7f0864c79c8d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cd5f6" Apr 21 01:50:03.929429 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.928765 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4d1027fb-9f71-4cf4-b0db-1f2916e50320-var-lib-kubelet\") pod \"tuned-l9kzx\" (UID: \"4d1027fb-9f71-4cf4-b0db-1f2916e50320\") " pod="openshift-cluster-node-tuning-operator/tuned-l9kzx" Apr 21 01:50:03.929429 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.928793 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/25059272-61bd-4d87-9141-9036eaa06ce3-konnectivity-ca\") pod \"konnectivity-agent-l94wt\" (UID: \"25059272-61bd-4d87-9141-9036eaa06ce3\") " pod="kube-system/konnectivity-agent-l94wt" Apr 21 01:50:03.929429 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.928856 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/214bcf67-a154-4c72-a914-d9efa8bdfee9-system-cni-dir\") pod \"multus-fdckn\" (UID: \"214bcf67-a154-4c72-a914-d9efa8bdfee9\") " pod="openshift-multus/multus-fdckn" Apr 21 01:50:03.929429 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.928877 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7b9f858d-0cf6-49d9-8632-23d2d0584e24-os-release\") pod \"multus-additional-cni-plugins-8h5m5\" (UID: \"7b9f858d-0cf6-49d9-8632-23d2d0584e24\") " pod="openshift-multus/multus-additional-cni-plugins-8h5m5" Apr 21 01:50:03.929429 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.928895 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/214bcf67-a154-4c72-a914-d9efa8bdfee9-multus-conf-dir\") pod \"multus-fdckn\" (UID: \"214bcf67-a154-4c72-a914-d9efa8bdfee9\") " pod="openshift-multus/multus-fdckn" Apr 21 01:50:03.929429 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.928921 2573 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/cacbe46e-f843-4ec6-b2e4-222e2ab51feb-iptables-alerter-script\") pod \"iptables-alerter-2wlcr\" (UID: \"cacbe46e-f843-4ec6-b2e4-222e2ab51feb\") " pod="openshift-network-operator/iptables-alerter-2wlcr" Apr 21 01:50:03.929429 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.928936 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4d1027fb-9f71-4cf4-b0db-1f2916e50320-host\") pod \"tuned-l9kzx\" (UID: \"4d1027fb-9f71-4cf4-b0db-1f2916e50320\") " pod="openshift-cluster-node-tuning-operator/tuned-l9kzx" Apr 21 01:50:03.929429 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.928950 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/214bcf67-a154-4c72-a914-d9efa8bdfee9-cni-binary-copy\") pod \"multus-fdckn\" (UID: \"214bcf67-a154-4c72-a914-d9efa8bdfee9\") " pod="openshift-multus/multus-fdckn" Apr 21 01:50:03.929429 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.928967 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/214bcf67-a154-4c72-a914-d9efa8bdfee9-host-run-k8s-cni-cncf-io\") pod \"multus-fdckn\" (UID: \"214bcf67-a154-4c72-a914-d9efa8bdfee9\") " pod="openshift-multus/multus-fdckn" Apr 21 01:50:03.929429 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.928990 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/4d1027fb-9f71-4cf4-b0db-1f2916e50320-etc-systemd\") pod \"tuned-l9kzx\" (UID: \"4d1027fb-9f71-4cf4-b0db-1f2916e50320\") " 
pod="openshift-cluster-node-tuning-operator/tuned-l9kzx" Apr 21 01:50:03.929429 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.929010 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4d1027fb-9f71-4cf4-b0db-1f2916e50320-sys\") pod \"tuned-l9kzx\" (UID: \"4d1027fb-9f71-4cf4-b0db-1f2916e50320\") " pod="openshift-cluster-node-tuning-operator/tuned-l9kzx" Apr 21 01:50:03.929429 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.929025 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/25059272-61bd-4d87-9141-9036eaa06ce3-agent-certs\") pod \"konnectivity-agent-l94wt\" (UID: \"25059272-61bd-4d87-9141-9036eaa06ce3\") " pod="kube-system/konnectivity-agent-l94wt" Apr 21 01:50:03.929429 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.929039 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7b9f858d-0cf6-49d9-8632-23d2d0584e24-cnibin\") pod \"multus-additional-cni-plugins-8h5m5\" (UID: \"7b9f858d-0cf6-49d9-8632-23d2d0584e24\") " pod="openshift-multus/multus-additional-cni-plugins-8h5m5" Apr 21 01:50:03.929429 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.929052 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ed377958-ce5b-41c7-9512-4b95b799767d-run-systemd\") pod \"ovnkube-node-f2tdx\" (UID: \"ed377958-ce5b-41c7-9512-4b95b799767d\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2tdx" Apr 21 01:50:03.929429 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.929072 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/ed377958-ce5b-41c7-9512-4b95b799767d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-f2tdx\" (UID: \"ed377958-ce5b-41c7-9512-4b95b799767d\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2tdx"
Apr 21 01:50:03.929429 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.929119 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/4d1027fb-9f71-4cf4-b0db-1f2916e50320-etc-sysconfig\") pod \"tuned-l9kzx\" (UID: \"4d1027fb-9f71-4cf4-b0db-1f2916e50320\") " pod="openshift-cluster-node-tuning-operator/tuned-l9kzx"
Apr 21 01:50:03.930029 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.929154 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4d1027fb-9f71-4cf4-b0db-1f2916e50320-etc-kubernetes\") pod \"tuned-l9kzx\" (UID: \"4d1027fb-9f71-4cf4-b0db-1f2916e50320\") " pod="openshift-cluster-node-tuning-operator/tuned-l9kzx"
Apr 21 01:50:03.930029 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.929183 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ed377958-ce5b-41c7-9512-4b95b799767d-run-openvswitch\") pod \"ovnkube-node-f2tdx\" (UID: \"ed377958-ce5b-41c7-9512-4b95b799767d\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2tdx"
Apr 21 01:50:03.930029 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.929208 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ed377958-ce5b-41c7-9512-4b95b799767d-host-run-ovn-kubernetes\") pod \"ovnkube-node-f2tdx\" (UID: \"ed377958-ce5b-41c7-9512-4b95b799767d\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2tdx"
Apr 21 01:50:03.930029 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.929231 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/4d1027fb-9f71-4cf4-b0db-1f2916e50320-etc-tuned\") pod \"tuned-l9kzx\" (UID: \"4d1027fb-9f71-4cf4-b0db-1f2916e50320\") " pod="openshift-cluster-node-tuning-operator/tuned-l9kzx"
Apr 21 01:50:03.930029 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.929257 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/cc11073d-4e8d-4b5b-b1bd-40ad61aa03b0-tmp-dir\") pod \"node-resolver-w4c4r\" (UID: \"cc11073d-4e8d-4b5b-b1bd-40ad61aa03b0\") " pod="openshift-dns/node-resolver-w4c4r"
Apr 21 01:50:03.930029 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.929281 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmb9g\" (UniqueName: \"kubernetes.io/projected/214bcf67-a154-4c72-a914-d9efa8bdfee9-kube-api-access-cmb9g\") pod \"multus-fdckn\" (UID: \"214bcf67-a154-4c72-a914-d9efa8bdfee9\") " pod="openshift-multus/multus-fdckn"
Apr 21 01:50:03.930029 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.929307 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6gzqh\" (UniqueName: \"kubernetes.io/projected/cacbe46e-f843-4ec6-b2e4-222e2ab51feb-kube-api-access-6gzqh\") pod \"iptables-alerter-2wlcr\" (UID: \"cacbe46e-f843-4ec6-b2e4-222e2ab51feb\") " pod="openshift-network-operator/iptables-alerter-2wlcr"
Apr 21 01:50:03.930029 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.929332 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ed377958-ce5b-41c7-9512-4b95b799767d-host-cni-bin\") pod \"ovnkube-node-f2tdx\" (UID: \"ed377958-ce5b-41c7-9512-4b95b799767d\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2tdx"
Apr 21 01:50:03.930029 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.929356 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ed377958-ce5b-41c7-9512-4b95b799767d-ovnkube-script-lib\") pod \"ovnkube-node-f2tdx\" (UID: \"ed377958-ce5b-41c7-9512-4b95b799767d\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2tdx"
Apr 21 01:50:03.930029 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.929394 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f7e3c291-0094-4e7c-9d88-7f0864c79c8d-socket-dir\") pod \"aws-ebs-csi-driver-node-cd5f6\" (UID: \"f7e3c291-0094-4e7c-9d88-7f0864c79c8d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cd5f6"
Apr 21 01:50:03.930029 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.929420 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/214bcf67-a154-4c72-a914-d9efa8bdfee9-hostroot\") pod \"multus-fdckn\" (UID: \"214bcf67-a154-4c72-a914-d9efa8bdfee9\") " pod="openshift-multus/multus-fdckn"
Apr 21 01:50:03.930029 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.929442 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ed377958-ce5b-41c7-9512-4b95b799767d-host-cni-netd\") pod \"ovnkube-node-f2tdx\" (UID: \"ed377958-ce5b-41c7-9512-4b95b799767d\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2tdx"
Apr 21 01:50:03.930029 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.929464 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vbf8p\" (UniqueName: \"kubernetes.io/projected/f7e3c291-0094-4e7c-9d88-7f0864c79c8d-kube-api-access-vbf8p\") pod \"aws-ebs-csi-driver-node-cd5f6\" (UID: \"f7e3c291-0094-4e7c-9d88-7f0864c79c8d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cd5f6"
Apr 21 01:50:03.930029 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.929483 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/4d1027fb-9f71-4cf4-b0db-1f2916e50320-etc-sysctl-d\") pod \"tuned-l9kzx\" (UID: \"4d1027fb-9f71-4cf4-b0db-1f2916e50320\") " pod="openshift-cluster-node-tuning-operator/tuned-l9kzx"
Apr 21 01:50:03.930029 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.929498 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4d1027fb-9f71-4cf4-b0db-1f2916e50320-tmp\") pod \"tuned-l9kzx\" (UID: \"4d1027fb-9f71-4cf4-b0db-1f2916e50320\") " pod="openshift-cluster-node-tuning-operator/tuned-l9kzx"
Apr 21 01:50:03.930029 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.929555 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/cc11073d-4e8d-4b5b-b1bd-40ad61aa03b0-hosts-file\") pod \"node-resolver-w4c4r\" (UID: \"cc11073d-4e8d-4b5b-b1bd-40ad61aa03b0\") " pod="openshift-dns/node-resolver-w4c4r"
Apr 21 01:50:03.930510 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.929582 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/214bcf67-a154-4c72-a914-d9efa8bdfee9-os-release\") pod \"multus-fdckn\" (UID: \"214bcf67-a154-4c72-a914-d9efa8bdfee9\") " pod="openshift-multus/multus-fdckn"
Apr 21 01:50:03.930510 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.929603 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/214bcf67-a154-4c72-a914-d9efa8bdfee9-multus-socket-dir-parent\") pod \"multus-fdckn\" (UID: \"214bcf67-a154-4c72-a914-d9efa8bdfee9\") " pod="openshift-multus/multus-fdckn"
Apr 21 01:50:03.930510 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.929624 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/214bcf67-a154-4c72-a914-d9efa8bdfee9-host-run-multus-certs\") pod \"multus-fdckn\" (UID: \"214bcf67-a154-4c72-a914-d9efa8bdfee9\") " pod="openshift-multus/multus-fdckn"
Apr 21 01:50:03.930510 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.929647 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ed377958-ce5b-41c7-9512-4b95b799767d-var-lib-openvswitch\") pod \"ovnkube-node-f2tdx\" (UID: \"ed377958-ce5b-41c7-9512-4b95b799767d\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2tdx"
Apr 21 01:50:03.930510 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.929668 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/f7e3c291-0094-4e7c-9d88-7f0864c79c8d-device-dir\") pod \"aws-ebs-csi-driver-node-cd5f6\" (UID: \"f7e3c291-0094-4e7c-9d88-7f0864c79c8d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cd5f6"
Apr 21 01:50:03.930510 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.929692 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/7b9f858d-0cf6-49d9-8632-23d2d0584e24-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-8h5m5\" (UID: \"7b9f858d-0cf6-49d9-8632-23d2d0584e24\") " pod="openshift-multus/multus-additional-cni-plugins-8h5m5"
Apr 21 01:50:03.937350 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.937325 2573 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 21 01:50:03.961198 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.961166 2573 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-p6swl"
Apr 21 01:50:03.973227 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:03.973200 2573 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-p6swl"
Apr 21 01:50:04.024944 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.024922 2573 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 21 01:50:04.025735 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:04.025707 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc7f96eee01b7ac864ac98f5fb6e45b4.slice/crio-50f75d479825e2edf10ab94cdfaa7f461ca603536eb38cfdd985f40d5f61454b WatchSource:0}: Error finding container 50f75d479825e2edf10ab94cdfaa7f461ca603536eb38cfdd985f40d5f61454b: Status 404 returned error can't find the container with id 50f75d479825e2edf10ab94cdfaa7f461ca603536eb38cfdd985f40d5f61454b
Apr 21 01:50:04.025979 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:04.025957 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod053154a49e4e985b6b9ca90f37194bef.slice/crio-fb9079a563f7112fb1aadff3e0480ffaca6c8735329b356721b87c8997024ec0 WatchSource:0}: Error finding container fb9079a563f7112fb1aadff3e0480ffaca6c8735329b356721b87c8997024ec0: Status 404 returned error can't find the container with id fb9079a563f7112fb1aadff3e0480ffaca6c8735329b356721b87c8997024ec0
Apr 21 01:50:04.029927 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.029900 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/214bcf67-a154-4c72-a914-d9efa8bdfee9-host-run-netns\") pod \"multus-fdckn\" (UID: \"214bcf67-a154-4c72-a914-d9efa8bdfee9\") " pod="openshift-multus/multus-fdckn"
Apr 21 01:50:04.030006 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.029945 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mm6p6\" (UniqueName: \"kubernetes.io/projected/560db137-e262-4c6c-9380-c422a8537e5e-kube-api-access-mm6p6\") pod \"network-check-target-bvqj7\" (UID: \"560db137-e262-4c6c-9380-c422a8537e5e\") " pod="openshift-network-diagnostics/network-check-target-bvqj7"
Apr 21 01:50:04.030006 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.029973 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v8bvh\" (UniqueName: \"kubernetes.io/projected/6de78603-c646-45a5-8bc4-9cfc56456d0f-kube-api-access-v8bvh\") pod \"node-ca-xfl2x\" (UID: \"6de78603-c646-45a5-8bc4-9cfc56456d0f\") " pod="openshift-image-registry/node-ca-xfl2x"
Apr 21 01:50:04.030006 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.029981 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/214bcf67-a154-4c72-a914-d9efa8bdfee9-host-run-netns\") pod \"multus-fdckn\" (UID: \"214bcf67-a154-4c72-a914-d9efa8bdfee9\") " pod="openshift-multus/multus-fdckn"
Apr 21 01:50:04.030006 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.029997 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/214bcf67-a154-4c72-a914-d9efa8bdfee9-cnibin\") pod \"multus-fdckn\" (UID: \"214bcf67-a154-4c72-a914-d9efa8bdfee9\") " pod="openshift-multus/multus-fdckn"
Apr 21 01:50:04.030197 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.030021 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/214bcf67-a154-4c72-a914-d9efa8bdfee9-host-var-lib-cni-bin\") pod \"multus-fdckn\" (UID: \"214bcf67-a154-4c72-a914-d9efa8bdfee9\") " pod="openshift-multus/multus-fdckn"
Apr 21 01:50:04.030197 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.030045 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/214bcf67-a154-4c72-a914-d9efa8bdfee9-etc-kubernetes\") pod \"multus-fdckn\" (UID: \"214bcf67-a154-4c72-a914-d9efa8bdfee9\") " pod="openshift-multus/multus-fdckn"
Apr 21 01:50:04.030197 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.030069 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ed377958-ce5b-41c7-9512-4b95b799767d-systemd-units\") pod \"ovnkube-node-f2tdx\" (UID: \"ed377958-ce5b-41c7-9512-4b95b799767d\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2tdx"
Apr 21 01:50:04.030197 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.030113 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/214bcf67-a154-4c72-a914-d9efa8bdfee9-host-var-lib-cni-bin\") pod \"multus-fdckn\" (UID: \"214bcf67-a154-4c72-a914-d9efa8bdfee9\") " pod="openshift-multus/multus-fdckn"
Apr 21 01:50:04.030197 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.030122 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/214bcf67-a154-4c72-a914-d9efa8bdfee9-etc-kubernetes\") pod \"multus-fdckn\" (UID: \"214bcf67-a154-4c72-a914-d9efa8bdfee9\") " pod="openshift-multus/multus-fdckn"
Apr 21 01:50:04.030197 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.030122 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/4d1027fb-9f71-4cf4-b0db-1f2916e50320-run\") pod \"tuned-l9kzx\" (UID: \"4d1027fb-9f71-4cf4-b0db-1f2916e50320\") " pod="openshift-cluster-node-tuning-operator/tuned-l9kzx"
Apr 21 01:50:04.030197 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.030152 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ed377958-ce5b-41c7-9512-4b95b799767d-systemd-units\") pod \"ovnkube-node-f2tdx\" (UID: \"ed377958-ce5b-41c7-9512-4b95b799767d\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2tdx"
Apr 21 01:50:04.030197 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.030171 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7b9f858d-0cf6-49d9-8632-23d2d0584e24-system-cni-dir\") pod \"multus-additional-cni-plugins-8h5m5\" (UID: \"7b9f858d-0cf6-49d9-8632-23d2d0584e24\") " pod="openshift-multus/multus-additional-cni-plugins-8h5m5"
Apr 21 01:50:04.030197 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.030182 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/4d1027fb-9f71-4cf4-b0db-1f2916e50320-run\") pod \"tuned-l9kzx\" (UID: \"4d1027fb-9f71-4cf4-b0db-1f2916e50320\") " pod="openshift-cluster-node-tuning-operator/tuned-l9kzx"
Apr 21 01:50:04.030197 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.030199 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mptvn\" (UniqueName: \"kubernetes.io/projected/7b9f858d-0cf6-49d9-8632-23d2d0584e24-kube-api-access-mptvn\") pod \"multus-additional-cni-plugins-8h5m5\" (UID: \"7b9f858d-0cf6-49d9-8632-23d2d0584e24\") " pod="openshift-multus/multus-additional-cni-plugins-8h5m5"
Apr 21 01:50:04.030650 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.030225 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7b9f858d-0cf6-49d9-8632-23d2d0584e24-system-cni-dir\") pod \"multus-additional-cni-plugins-8h5m5\" (UID: \"7b9f858d-0cf6-49d9-8632-23d2d0584e24\") " pod="openshift-multus/multus-additional-cni-plugins-8h5m5"
Apr 21 01:50:04.030650 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.030228 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ed377958-ce5b-41c7-9512-4b95b799767d-host-run-netns\") pod \"ovnkube-node-f2tdx\" (UID: \"ed377958-ce5b-41c7-9512-4b95b799767d\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2tdx"
Apr 21 01:50:04.030650 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.030265 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ed377958-ce5b-41c7-9512-4b95b799767d-host-run-netns\") pod \"ovnkube-node-f2tdx\" (UID: \"ed377958-ce5b-41c7-9512-4b95b799767d\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2tdx"
Apr 21 01:50:04.030650 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.030270 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4d1027fb-9f71-4cf4-b0db-1f2916e50320-lib-modules\") pod \"tuned-l9kzx\" (UID: \"4d1027fb-9f71-4cf4-b0db-1f2916e50320\") " pod="openshift-cluster-node-tuning-operator/tuned-l9kzx"
Apr 21 01:50:04.030650 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.030303 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c2vrm\" (UniqueName: \"kubernetes.io/projected/4d1027fb-9f71-4cf4-b0db-1f2916e50320-kube-api-access-c2vrm\") pod \"tuned-l9kzx\" (UID: \"4d1027fb-9f71-4cf4-b0db-1f2916e50320\") " pod="openshift-cluster-node-tuning-operator/tuned-l9kzx"
Apr 21 01:50:04.030650 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.030319 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7b9f858d-0cf6-49d9-8632-23d2d0584e24-tuning-conf-dir\") pod \"multus-additional-cni-plugins-8h5m5\" (UID: \"7b9f858d-0cf6-49d9-8632-23d2d0584e24\") " pod="openshift-multus/multus-additional-cni-plugins-8h5m5"
Apr 21 01:50:04.030650 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.030335 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ed377958-ce5b-41c7-9512-4b95b799767d-run-ovn\") pod \"ovnkube-node-f2tdx\" (UID: \"ed377958-ce5b-41c7-9512-4b95b799767d\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2tdx"
Apr 21 01:50:04.030650 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.030356 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z7d5b\" (UniqueName: \"kubernetes.io/projected/ed377958-ce5b-41c7-9512-4b95b799767d-kube-api-access-z7d5b\") pod \"ovnkube-node-f2tdx\" (UID: \"ed377958-ce5b-41c7-9512-4b95b799767d\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2tdx"
Apr 21 01:50:04.030650 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.030371 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/4d1027fb-9f71-4cf4-b0db-1f2916e50320-etc-modprobe-d\") pod \"tuned-l9kzx\" (UID: \"4d1027fb-9f71-4cf4-b0db-1f2916e50320\") " pod="openshift-cluster-node-tuning-operator/tuned-l9kzx"
Apr 21 01:50:04.030650 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.030380 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4d1027fb-9f71-4cf4-b0db-1f2916e50320-lib-modules\") pod \"tuned-l9kzx\" (UID: \"4d1027fb-9f71-4cf4-b0db-1f2916e50320\") " pod="openshift-cluster-node-tuning-operator/tuned-l9kzx"
Apr 21 01:50:04.030650 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.030385 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/6de78603-c646-45a5-8bc4-9cfc56456d0f-serviceca\") pod \"node-ca-xfl2x\" (UID: \"6de78603-c646-45a5-8bc4-9cfc56456d0f\") " pod="openshift-image-registry/node-ca-xfl2x"
Apr 21 01:50:04.030650 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.030463 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/214bcf67-a154-4c72-a914-d9efa8bdfee9-multus-cni-dir\") pod \"multus-fdckn\" (UID: \"214bcf67-a154-4c72-a914-d9efa8bdfee9\") " pod="openshift-multus/multus-fdckn"
Apr 21 01:50:04.030650 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.030495 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/214bcf67-a154-4c72-a914-d9efa8bdfee9-multus-daemon-config\") pod \"multus-fdckn\" (UID: \"214bcf67-a154-4c72-a914-d9efa8bdfee9\") " pod="openshift-multus/multus-fdckn"
Apr 21 01:50:04.030650 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.030520 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ed377958-ce5b-41c7-9512-4b95b799767d-etc-openvswitch\") pod \"ovnkube-node-f2tdx\" (UID: \"ed377958-ce5b-41c7-9512-4b95b799767d\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2tdx"
Apr 21 01:50:04.030650 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.030542 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7b9f858d-0cf6-49d9-8632-23d2d0584e24-tuning-conf-dir\") pod \"multus-additional-cni-plugins-8h5m5\" (UID: \"7b9f858d-0cf6-49d9-8632-23d2d0584e24\") " pod="openshift-multus/multus-additional-cni-plugins-8h5m5"
Apr 21 01:50:04.030650 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.030544 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ed377958-ce5b-41c7-9512-4b95b799767d-env-overrides\") pod \"ovnkube-node-f2tdx\" (UID: \"ed377958-ce5b-41c7-9512-4b95b799767d\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2tdx"
Apr 21 01:50:04.030650 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.030561 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/214bcf67-a154-4c72-a914-d9efa8bdfee9-multus-cni-dir\") pod \"multus-fdckn\" (UID: \"214bcf67-a154-4c72-a914-d9efa8bdfee9\") " pod="openshift-multus/multus-fdckn"
Apr 21 01:50:04.031440 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.030599 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/f7e3c291-0094-4e7c-9d88-7f0864c79c8d-sys-fs\") pod \"aws-ebs-csi-driver-node-cd5f6\" (UID: \"f7e3c291-0094-4e7c-9d88-7f0864c79c8d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cd5f6"
Apr 21 01:50:04.031440 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.030614 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ed377958-ce5b-41c7-9512-4b95b799767d-etc-openvswitch\") pod \"ovnkube-node-f2tdx\" (UID: \"ed377958-ce5b-41c7-9512-4b95b799767d\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2tdx"
Apr 21 01:50:04.031440 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.030631 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9c103689-40cc-470b-9109-33a63ff6f5dd-metrics-certs\") pod \"network-metrics-daemon-mfs4c\" (UID: \"9c103689-40cc-470b-9109-33a63ff6f5dd\") " pod="openshift-multus/network-metrics-daemon-mfs4c"
Apr 21 01:50:04.031440 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.030680 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ed377958-ce5b-41c7-9512-4b95b799767d-run-ovn\") pod \"ovnkube-node-f2tdx\" (UID: \"ed377958-ce5b-41c7-9512-4b95b799767d\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2tdx"
Apr 21 01:50:04.031440 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.030698 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ed377958-ce5b-41c7-9512-4b95b799767d-ovnkube-config\") pod \"ovnkube-node-f2tdx\" (UID: \"ed377958-ce5b-41c7-9512-4b95b799767d\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2tdx"
Apr 21 01:50:04.031440 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.030732 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-49fp5\" (UniqueName: \"kubernetes.io/projected/cc11073d-4e8d-4b5b-b1bd-40ad61aa03b0-kube-api-access-49fp5\") pod \"node-resolver-w4c4r\" (UID: \"cc11073d-4e8d-4b5b-b1bd-40ad61aa03b0\") " pod="openshift-dns/node-resolver-w4c4r"
Apr 21 01:50:04.031440 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.030763 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6de78603-c646-45a5-8bc4-9cfc56456d0f-host\") pod \"node-ca-xfl2x\" (UID: \"6de78603-c646-45a5-8bc4-9cfc56456d0f\") " pod="openshift-image-registry/node-ca-xfl2x"
Apr 21 01:50:04.031440 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.030784 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/f7e3c291-0094-4e7c-9d88-7f0864c79c8d-sys-fs\") pod \"aws-ebs-csi-driver-node-cd5f6\" (UID: \"f7e3c291-0094-4e7c-9d88-7f0864c79c8d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cd5f6"
Apr 21 01:50:04.031440 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.030797 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/4d1027fb-9f71-4cf4-b0db-1f2916e50320-etc-modprobe-d\") pod \"tuned-l9kzx\" (UID: \"4d1027fb-9f71-4cf4-b0db-1f2916e50320\") " pod="openshift-cluster-node-tuning-operator/tuned-l9kzx"
Apr 21 01:50:04.031440 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.030788 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7b9f858d-0cf6-49d9-8632-23d2d0584e24-cni-binary-copy\") pod \"multus-additional-cni-plugins-8h5m5\" (UID: \"7b9f858d-0cf6-49d9-8632-23d2d0584e24\") " pod="openshift-multus/multus-additional-cni-plugins-8h5m5"
Apr 21 01:50:04.031440 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.030855 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/cacbe46e-f843-4ec6-b2e4-222e2ab51feb-host-slash\") pod \"iptables-alerter-2wlcr\" (UID: \"cacbe46e-f843-4ec6-b2e4-222e2ab51feb\") " pod="openshift-network-operator/iptables-alerter-2wlcr"
Apr 21 01:50:04.031440 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.030884 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ed377958-ce5b-41c7-9512-4b95b799767d-node-log\") pod \"ovnkube-node-f2tdx\" (UID: \"ed377958-ce5b-41c7-9512-4b95b799767d\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2tdx"
Apr 21 01:50:04.031440 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.030884 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/214bcf67-a154-4c72-a914-d9efa8bdfee9-cnibin\") pod \"multus-fdckn\" (UID: \"214bcf67-a154-4c72-a914-d9efa8bdfee9\") " pod="openshift-multus/multus-fdckn"
Apr 21 01:50:04.031440 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.030912 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f7e3c291-0094-4e7c-9d88-7f0864c79c8d-registration-dir\") pod \"aws-ebs-csi-driver-node-cd5f6\" (UID: \"f7e3c291-0094-4e7c-9d88-7f0864c79c8d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cd5f6"
Apr 21 01:50:04.031440 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.030937 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/cacbe46e-f843-4ec6-b2e4-222e2ab51feb-host-slash\") pod \"iptables-alerter-2wlcr\" (UID: \"cacbe46e-f843-4ec6-b2e4-222e2ab51feb\") " pod="openshift-network-operator/iptables-alerter-2wlcr"
Apr 21 01:50:04.031440 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.030940 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7b9f858d-0cf6-49d9-8632-23d2d0584e24-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-8h5m5\" (UID: \"7b9f858d-0cf6-49d9-8632-23d2d0584e24\") " pod="openshift-multus/multus-additional-cni-plugins-8h5m5"
Apr 21 01:50:04.031440 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.030978 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ed377958-ce5b-41c7-9512-4b95b799767d-node-log\") pod \"ovnkube-node-f2tdx\" (UID: \"ed377958-ce5b-41c7-9512-4b95b799767d\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2tdx"
Apr 21 01:50:04.032296 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.031004 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ed377958-ce5b-41c7-9512-4b95b799767d-host-kubelet\") pod \"ovnkube-node-f2tdx\" (UID: \"ed377958-ce5b-41c7-9512-4b95b799767d\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2tdx"
Apr 21 01:50:04.032296 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.031029 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/f7e3c291-0094-4e7c-9d88-7f0864c79c8d-etc-selinux\") pod \"aws-ebs-csi-driver-node-cd5f6\" (UID: \"f7e3c291-0094-4e7c-9d88-7f0864c79c8d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cd5f6"
Apr 21 01:50:04.032296 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.031054 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4d1027fb-9f71-4cf4-b0db-1f2916e50320-var-lib-kubelet\") pod \"tuned-l9kzx\" (UID: \"4d1027fb-9f71-4cf4-b0db-1f2916e50320\") " pod="openshift-cluster-node-tuning-operator/tuned-l9kzx"
Apr 21 01:50:04.032296 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.031078 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/25059272-61bd-4d87-9141-9036eaa06ce3-konnectivity-ca\") pod \"konnectivity-agent-l94wt\" (UID: \"25059272-61bd-4d87-9141-9036eaa06ce3\") " pod="kube-system/konnectivity-agent-l94wt"
Apr 21 01:50:04.032296 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.031103 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/214bcf67-a154-4c72-a914-d9efa8bdfee9-system-cni-dir\") pod \"multus-fdckn\" (UID: \"214bcf67-a154-4c72-a914-d9efa8bdfee9\") " pod="openshift-multus/multus-fdckn"
Apr 21 01:50:04.032296 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.031132 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7wlr\" (UniqueName: \"kubernetes.io/projected/9c103689-40cc-470b-9109-33a63ff6f5dd-kube-api-access-t7wlr\") pod \"network-metrics-daemon-mfs4c\" (UID: \"9c103689-40cc-470b-9109-33a63ff6f5dd\") " pod="openshift-multus/network-metrics-daemon-mfs4c"
Apr 21 01:50:04.032296 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.031160 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7b9f858d-0cf6-49d9-8632-23d2d0584e24-os-release\") pod \"multus-additional-cni-plugins-8h5m5\" (UID: \"7b9f858d-0cf6-49d9-8632-23d2d0584e24\") " pod="openshift-multus/multus-additional-cni-plugins-8h5m5"
Apr 21 01:50:04.032296 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.031185 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/214bcf67-a154-4c72-a914-d9efa8bdfee9-multus-conf-dir\") pod \"multus-fdckn\" (UID: \"214bcf67-a154-4c72-a914-d9efa8bdfee9\") " pod="openshift-multus/multus-fdckn"
Apr 21 01:50:04.032296 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.031199 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/6de78603-c646-45a5-8bc4-9cfc56456d0f-serviceca\") pod \"node-ca-xfl2x\" (UID: \"6de78603-c646-45a5-8bc4-9cfc56456d0f\") " pod="openshift-image-registry/node-ca-xfl2x"
Apr 21 01:50:04.032296 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.031207 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ed377958-ce5b-41c7-9512-4b95b799767d-env-overrides\") pod \"ovnkube-node-f2tdx\" (UID: \"ed377958-ce5b-41c7-9512-4b95b799767d\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2tdx"
Apr 21 01:50:04.032296 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.031209 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/cacbe46e-f843-4ec6-b2e4-222e2ab51feb-iptables-alerter-script\") pod \"iptables-alerter-2wlcr\" (UID: \"cacbe46e-f843-4ec6-b2e4-222e2ab51feb\") " pod="openshift-network-operator/iptables-alerter-2wlcr"
Apr 21 01:50:04.032296 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.031279 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4d1027fb-9f71-4cf4-b0db-1f2916e50320-host\") pod \"tuned-l9kzx\" (UID: \"4d1027fb-9f71-4cf4-b0db-1f2916e50320\") " pod="openshift-cluster-node-tuning-operator/tuned-l9kzx"
Apr 21 01:50:04.032296 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.031302 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/214bcf67-a154-4c72-a914-d9efa8bdfee9-cni-binary-copy\") pod \"multus-fdckn\" (UID: \"214bcf67-a154-4c72-a914-d9efa8bdfee9\") " pod="openshift-multus/multus-fdckn"
Apr 21 01:50:04.032296 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.031318 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/214bcf67-a154-4c72-a914-d9efa8bdfee9-host-run-k8s-cni-cncf-io\") pod \"multus-fdckn\" (UID: \"214bcf67-a154-4c72-a914-d9efa8bdfee9\") " pod="openshift-multus/multus-fdckn"
Apr 21 01:50:04.032296 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.031334 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/4d1027fb-9f71-4cf4-b0db-1f2916e50320-etc-systemd\") pod \"tuned-l9kzx\" (UID: \"4d1027fb-9f71-4cf4-b0db-1f2916e50320\") " pod="openshift-cluster-node-tuning-operator/tuned-l9kzx"
Apr 21 01:50:04.032296 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.031348 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4d1027fb-9f71-4cf4-b0db-1f2916e50320-sys\") pod \"tuned-l9kzx\" (UID: \"4d1027fb-9f71-4cf4-b0db-1f2916e50320\") " pod="openshift-cluster-node-tuning-operator/tuned-l9kzx"
Apr 21 01:50:04.032296 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.031363 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/25059272-61bd-4d87-9141-9036eaa06ce3-agent-certs\") pod \"konnectivity-agent-l94wt\" (UID: \"25059272-61bd-4d87-9141-9036eaa06ce3\") " pod="kube-system/konnectivity-agent-l94wt"
Apr 21 01:50:04.033007 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.031380 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7b9f858d-0cf6-49d9-8632-23d2d0584e24-cnibin\") pod \"multus-additional-cni-plugins-8h5m5\" (UID: \"7b9f858d-0cf6-49d9-8632-23d2d0584e24\") " pod="openshift-multus/multus-additional-cni-plugins-8h5m5"
Apr 21 01:50:04.033007 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.031396 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ed377958-ce5b-41c7-9512-4b95b799767d-run-systemd\") pod \"ovnkube-node-f2tdx\" (UID: \"ed377958-ce5b-41c7-9512-4b95b799767d\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2tdx"
Apr 21 01:50:04.033007 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.031413 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ed377958-ce5b-41c7-9512-4b95b799767d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-f2tdx\" (UID: \"ed377958-ce5b-41c7-9512-4b95b799767d\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2tdx"
Apr 21 01:50:04.033007 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.031430 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/4d1027fb-9f71-4cf4-b0db-1f2916e50320-etc-sysconfig\") pod \"tuned-l9kzx\" (UID: \"4d1027fb-9f71-4cf4-b0db-1f2916e50320\") " pod="openshift-cluster-node-tuning-operator/tuned-l9kzx"
Apr 21 01:50:04.033007 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.031445 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4d1027fb-9f71-4cf4-b0db-1f2916e50320-etc-kubernetes\") pod \"tuned-l9kzx\" (UID: \"4d1027fb-9f71-4cf4-b0db-1f2916e50320\") " pod="openshift-cluster-node-tuning-operator/tuned-l9kzx"
Apr 21 01:50:04.033007 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.031448 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7b9f858d-0cf6-49d9-8632-23d2d0584e24-cni-binary-copy\") pod \"multus-additional-cni-plugins-8h5m5\" (UID: \"7b9f858d-0cf6-49d9-8632-23d2d0584e24\") " pod="openshift-multus/multus-additional-cni-plugins-8h5m5"
Apr 21 01:50:04.033007 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.031451 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7b9f858d-0cf6-49d9-8632-23d2d0584e24-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-8h5m5\" (UID: \"7b9f858d-0cf6-49d9-8632-23d2d0584e24\") " pod="openshift-multus/multus-additional-cni-plugins-8h5m5"
Apr 21 01:50:04.033007 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.031465 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ed377958-ce5b-41c7-9512-4b95b799767d-run-openvswitch\") pod \"ovnkube-node-f2tdx\" (UID: \"ed377958-ce5b-41c7-9512-4b95b799767d\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2tdx"
Apr 21 01:50:04.033007 ip-10-0-129-52
kubenswrapper[2573]: I0421 01:50:04.031525 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ed377958-ce5b-41c7-9512-4b95b799767d-host-run-ovn-kubernetes\") pod \"ovnkube-node-f2tdx\" (UID: \"ed377958-ce5b-41c7-9512-4b95b799767d\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2tdx" Apr 21 01:50:04.033007 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.031620 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/f7e3c291-0094-4e7c-9d88-7f0864c79c8d-etc-selinux\") pod \"aws-ebs-csi-driver-node-cd5f6\" (UID: \"f7e3c291-0094-4e7c-9d88-7f0864c79c8d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cd5f6" Apr 21 01:50:04.033007 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.031627 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7b9f858d-0cf6-49d9-8632-23d2d0584e24-cnibin\") pod \"multus-additional-cni-plugins-8h5m5\" (UID: \"7b9f858d-0cf6-49d9-8632-23d2d0584e24\") " pod="openshift-multus/multus-additional-cni-plugins-8h5m5" Apr 21 01:50:04.033007 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.031664 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4d1027fb-9f71-4cf4-b0db-1f2916e50320-var-lib-kubelet\") pod \"tuned-l9kzx\" (UID: \"4d1027fb-9f71-4cf4-b0db-1f2916e50320\") " pod="openshift-cluster-node-tuning-operator/tuned-l9kzx" Apr 21 01:50:04.033007 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.031690 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ed377958-ce5b-41c7-9512-4b95b799767d-run-openvswitch\") pod \"ovnkube-node-f2tdx\" (UID: \"ed377958-ce5b-41c7-9512-4b95b799767d\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2tdx" Apr 21 
01:50:04.033007 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.031704 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/cacbe46e-f843-4ec6-b2e4-222e2ab51feb-iptables-alerter-script\") pod \"iptables-alerter-2wlcr\" (UID: \"cacbe46e-f843-4ec6-b2e4-222e2ab51feb\") " pod="openshift-network-operator/iptables-alerter-2wlcr" Apr 21 01:50:04.033007 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.031728 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ed377958-ce5b-41c7-9512-4b95b799767d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-f2tdx\" (UID: \"ed377958-ce5b-41c7-9512-4b95b799767d\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2tdx" Apr 21 01:50:04.033007 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.031752 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ed377958-ce5b-41c7-9512-4b95b799767d-host-run-ovn-kubernetes\") pod \"ovnkube-node-f2tdx\" (UID: \"ed377958-ce5b-41c7-9512-4b95b799767d\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2tdx" Apr 21 01:50:04.033007 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.031757 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/214bcf67-a154-4c72-a914-d9efa8bdfee9-host-run-k8s-cni-cncf-io\") pod \"multus-fdckn\" (UID: \"214bcf67-a154-4c72-a914-d9efa8bdfee9\") " pod="openshift-multus/multus-fdckn" Apr 21 01:50:04.033861 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.031789 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/4d1027fb-9f71-4cf4-b0db-1f2916e50320-etc-tuned\") pod \"tuned-l9kzx\" (UID: \"4d1027fb-9f71-4cf4-b0db-1f2916e50320\") 
" pod="openshift-cluster-node-tuning-operator/tuned-l9kzx" Apr 21 01:50:04.033861 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.031798 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/4d1027fb-9f71-4cf4-b0db-1f2916e50320-etc-sysconfig\") pod \"tuned-l9kzx\" (UID: \"4d1027fb-9f71-4cf4-b0db-1f2916e50320\") " pod="openshift-cluster-node-tuning-operator/tuned-l9kzx" Apr 21 01:50:04.033861 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.031837 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/cc11073d-4e8d-4b5b-b1bd-40ad61aa03b0-tmp-dir\") pod \"node-resolver-w4c4r\" (UID: \"cc11073d-4e8d-4b5b-b1bd-40ad61aa03b0\") " pod="openshift-dns/node-resolver-w4c4r" Apr 21 01:50:04.033861 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.031853 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4d1027fb-9f71-4cf4-b0db-1f2916e50320-etc-kubernetes\") pod \"tuned-l9kzx\" (UID: \"4d1027fb-9f71-4cf4-b0db-1f2916e50320\") " pod="openshift-cluster-node-tuning-operator/tuned-l9kzx" Apr 21 01:50:04.033861 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.031866 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cmb9g\" (UniqueName: \"kubernetes.io/projected/214bcf67-a154-4c72-a914-d9efa8bdfee9-kube-api-access-cmb9g\") pod \"multus-fdckn\" (UID: \"214bcf67-a154-4c72-a914-d9efa8bdfee9\") " pod="openshift-multus/multus-fdckn" Apr 21 01:50:04.033861 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.031893 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6gzqh\" (UniqueName: \"kubernetes.io/projected/cacbe46e-f843-4ec6-b2e4-222e2ab51feb-kube-api-access-6gzqh\") pod \"iptables-alerter-2wlcr\" (UID: \"cacbe46e-f843-4ec6-b2e4-222e2ab51feb\") " 
pod="openshift-network-operator/iptables-alerter-2wlcr" Apr 21 01:50:04.033861 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.031903 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7b9f858d-0cf6-49d9-8632-23d2d0584e24-os-release\") pod \"multus-additional-cni-plugins-8h5m5\" (UID: \"7b9f858d-0cf6-49d9-8632-23d2d0584e24\") " pod="openshift-multus/multus-additional-cni-plugins-8h5m5" Apr 21 01:50:04.033861 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.031918 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ed377958-ce5b-41c7-9512-4b95b799767d-host-cni-bin\") pod \"ovnkube-node-f2tdx\" (UID: \"ed377958-ce5b-41c7-9512-4b95b799767d\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2tdx" Apr 21 01:50:04.033861 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.031925 2573 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 21 01:50:04.033861 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.031943 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ed377958-ce5b-41c7-9512-4b95b799767d-ovnkube-script-lib\") pod \"ovnkube-node-f2tdx\" (UID: \"ed377958-ce5b-41c7-9512-4b95b799767d\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2tdx" Apr 21 01:50:04.033861 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.031969 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f7e3c291-0094-4e7c-9d88-7f0864c79c8d-socket-dir\") pod \"aws-ebs-csi-driver-node-cd5f6\" (UID: \"f7e3c291-0094-4e7c-9d88-7f0864c79c8d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cd5f6" Apr 21 01:50:04.033861 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.031993 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/214bcf67-a154-4c72-a914-d9efa8bdfee9-hostroot\") pod \"multus-fdckn\" (UID: \"214bcf67-a154-4c72-a914-d9efa8bdfee9\") " pod="openshift-multus/multus-fdckn" Apr 21 01:50:04.033861 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.032056 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ed377958-ce5b-41c7-9512-4b95b799767d-host-cni-netd\") pod \"ovnkube-node-f2tdx\" (UID: \"ed377958-ce5b-41c7-9512-4b95b799767d\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2tdx" Apr 21 01:50:04.033861 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.032084 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vbf8p\" (UniqueName: 
\"kubernetes.io/projected/f7e3c291-0094-4e7c-9d88-7f0864c79c8d-kube-api-access-vbf8p\") pod \"aws-ebs-csi-driver-node-cd5f6\" (UID: \"f7e3c291-0094-4e7c-9d88-7f0864c79c8d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cd5f6" Apr 21 01:50:04.033861 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.032113 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/4d1027fb-9f71-4cf4-b0db-1f2916e50320-etc-sysctl-d\") pod \"tuned-l9kzx\" (UID: \"4d1027fb-9f71-4cf4-b0db-1f2916e50320\") " pod="openshift-cluster-node-tuning-operator/tuned-l9kzx" Apr 21 01:50:04.033861 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.032122 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/214bcf67-a154-4c72-a914-d9efa8bdfee9-cni-binary-copy\") pod \"multus-fdckn\" (UID: \"214bcf67-a154-4c72-a914-d9efa8bdfee9\") " pod="openshift-multus/multus-fdckn" Apr 21 01:50:04.033861 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.032139 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4d1027fb-9f71-4cf4-b0db-1f2916e50320-tmp\") pod \"tuned-l9kzx\" (UID: \"4d1027fb-9f71-4cf4-b0db-1f2916e50320\") " pod="openshift-cluster-node-tuning-operator/tuned-l9kzx" Apr 21 01:50:04.033861 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.032143 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ed377958-ce5b-41c7-9512-4b95b799767d-ovnkube-config\") pod \"ovnkube-node-f2tdx\" (UID: \"ed377958-ce5b-41c7-9512-4b95b799767d\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2tdx" Apr 21 01:50:04.034951 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.032167 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: 
\"kubernetes.io/host-path/cc11073d-4e8d-4b5b-b1bd-40ad61aa03b0-hosts-file\") pod \"node-resolver-w4c4r\" (UID: \"cc11073d-4e8d-4b5b-b1bd-40ad61aa03b0\") " pod="openshift-dns/node-resolver-w4c4r" Apr 21 01:50:04.034951 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.032187 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/214bcf67-a154-4c72-a914-d9efa8bdfee9-system-cni-dir\") pod \"multus-fdckn\" (UID: \"214bcf67-a154-4c72-a914-d9efa8bdfee9\") " pod="openshift-multus/multus-fdckn" Apr 21 01:50:04.034951 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.032197 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/214bcf67-a154-4c72-a914-d9efa8bdfee9-multus-conf-dir\") pod \"multus-fdckn\" (UID: \"214bcf67-a154-4c72-a914-d9efa8bdfee9\") " pod="openshift-multus/multus-fdckn" Apr 21 01:50:04.034951 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.032201 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/214bcf67-a154-4c72-a914-d9efa8bdfee9-os-release\") pod \"multus-fdckn\" (UID: \"214bcf67-a154-4c72-a914-d9efa8bdfee9\") " pod="openshift-multus/multus-fdckn" Apr 21 01:50:04.034951 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.032223 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ed377958-ce5b-41c7-9512-4b95b799767d-run-systemd\") pod \"ovnkube-node-f2tdx\" (UID: \"ed377958-ce5b-41c7-9512-4b95b799767d\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2tdx" Apr 21 01:50:04.034951 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.032227 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: 
\"kubernetes.io/host-path/214bcf67-a154-4c72-a914-d9efa8bdfee9-multus-socket-dir-parent\") pod \"multus-fdckn\" (UID: \"214bcf67-a154-4c72-a914-d9efa8bdfee9\") " pod="openshift-multus/multus-fdckn" Apr 21 01:50:04.034951 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.032254 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/214bcf67-a154-4c72-a914-d9efa8bdfee9-host-run-multus-certs\") pod \"multus-fdckn\" (UID: \"214bcf67-a154-4c72-a914-d9efa8bdfee9\") " pod="openshift-multus/multus-fdckn" Apr 21 01:50:04.034951 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.032277 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/cc11073d-4e8d-4b5b-b1bd-40ad61aa03b0-tmp-dir\") pod \"node-resolver-w4c4r\" (UID: \"cc11073d-4e8d-4b5b-b1bd-40ad61aa03b0\") " pod="openshift-dns/node-resolver-w4c4r" Apr 21 01:50:04.034951 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.032282 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ed377958-ce5b-41c7-9512-4b95b799767d-host-cni-netd\") pod \"ovnkube-node-f2tdx\" (UID: \"ed377958-ce5b-41c7-9512-4b95b799767d\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2tdx" Apr 21 01:50:04.034951 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.032325 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/214bcf67-a154-4c72-a914-d9efa8bdfee9-hostroot\") pod \"multus-fdckn\" (UID: \"214bcf67-a154-4c72-a914-d9efa8bdfee9\") " pod="openshift-multus/multus-fdckn" Apr 21 01:50:04.034951 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.032338 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/4d1027fb-9f71-4cf4-b0db-1f2916e50320-etc-systemd\") pod \"tuned-l9kzx\" 
(UID: \"4d1027fb-9f71-4cf4-b0db-1f2916e50320\") " pod="openshift-cluster-node-tuning-operator/tuned-l9kzx" Apr 21 01:50:04.034951 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.032369 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6de78603-c646-45a5-8bc4-9cfc56456d0f-host\") pod \"node-ca-xfl2x\" (UID: \"6de78603-c646-45a5-8bc4-9cfc56456d0f\") " pod="openshift-image-registry/node-ca-xfl2x" Apr 21 01:50:04.034951 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.031623 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f7e3c291-0094-4e7c-9d88-7f0864c79c8d-registration-dir\") pod \"aws-ebs-csi-driver-node-cd5f6\" (UID: \"f7e3c291-0094-4e7c-9d88-7f0864c79c8d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cd5f6" Apr 21 01:50:04.034951 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.032386 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/cc11073d-4e8d-4b5b-b1bd-40ad61aa03b0-hosts-file\") pod \"node-resolver-w4c4r\" (UID: \"cc11073d-4e8d-4b5b-b1bd-40ad61aa03b0\") " pod="openshift-dns/node-resolver-w4c4r" Apr 21 01:50:04.034951 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.032410 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4d1027fb-9f71-4cf4-b0db-1f2916e50320-sys\") pod \"tuned-l9kzx\" (UID: \"4d1027fb-9f71-4cf4-b0db-1f2916e50320\") " pod="openshift-cluster-node-tuning-operator/tuned-l9kzx" Apr 21 01:50:04.034951 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.032409 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ed377958-ce5b-41c7-9512-4b95b799767d-host-kubelet\") pod \"ovnkube-node-f2tdx\" (UID: \"ed377958-ce5b-41c7-9512-4b95b799767d\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-f2tdx" Apr 21 01:50:04.034951 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.032442 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/214bcf67-a154-4c72-a914-d9efa8bdfee9-os-release\") pod \"multus-fdckn\" (UID: \"214bcf67-a154-4c72-a914-d9efa8bdfee9\") " pod="openshift-multus/multus-fdckn" Apr 21 01:50:04.034951 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.032484 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4d1027fb-9f71-4cf4-b0db-1f2916e50320-host\") pod \"tuned-l9kzx\" (UID: \"4d1027fb-9f71-4cf4-b0db-1f2916e50320\") " pod="openshift-cluster-node-tuning-operator/tuned-l9kzx" Apr 21 01:50:04.035730 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.032518 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ed377958-ce5b-41c7-9512-4b95b799767d-host-cni-bin\") pod \"ovnkube-node-f2tdx\" (UID: \"ed377958-ce5b-41c7-9512-4b95b799767d\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2tdx" Apr 21 01:50:04.035730 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.032521 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f7e3c291-0094-4e7c-9d88-7f0864c79c8d-socket-dir\") pod \"aws-ebs-csi-driver-node-cd5f6\" (UID: \"f7e3c291-0094-4e7c-9d88-7f0864c79c8d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cd5f6" Apr 21 01:50:04.035730 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.032897 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/214bcf67-a154-4c72-a914-d9efa8bdfee9-multus-socket-dir-parent\") pod \"multus-fdckn\" (UID: \"214bcf67-a154-4c72-a914-d9efa8bdfee9\") " pod="openshift-multus/multus-fdckn" Apr 21 
01:50:04.035730 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.033085 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ed377958-ce5b-41c7-9512-4b95b799767d-ovnkube-script-lib\") pod \"ovnkube-node-f2tdx\" (UID: \"ed377958-ce5b-41c7-9512-4b95b799767d\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2tdx" Apr 21 01:50:04.035730 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.033141 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/214bcf67-a154-4c72-a914-d9efa8bdfee9-host-run-multus-certs\") pod \"multus-fdckn\" (UID: \"214bcf67-a154-4c72-a914-d9efa8bdfee9\") " pod="openshift-multus/multus-fdckn" Apr 21 01:50:04.035730 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.033277 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/4d1027fb-9f71-4cf4-b0db-1f2916e50320-etc-sysctl-d\") pod \"tuned-l9kzx\" (UID: \"4d1027fb-9f71-4cf4-b0db-1f2916e50320\") " pod="openshift-cluster-node-tuning-operator/tuned-l9kzx" Apr 21 01:50:04.035730 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.033330 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ed377958-ce5b-41c7-9512-4b95b799767d-var-lib-openvswitch\") pod \"ovnkube-node-f2tdx\" (UID: \"ed377958-ce5b-41c7-9512-4b95b799767d\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2tdx" Apr 21 01:50:04.035730 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.033372 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/f7e3c291-0094-4e7c-9d88-7f0864c79c8d-device-dir\") pod \"aws-ebs-csi-driver-node-cd5f6\" (UID: \"f7e3c291-0094-4e7c-9d88-7f0864c79c8d\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cd5f6" Apr 21 01:50:04.035730 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.033400 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/7b9f858d-0cf6-49d9-8632-23d2d0584e24-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-8h5m5\" (UID: \"7b9f858d-0cf6-49d9-8632-23d2d0584e24\") " pod="openshift-multus/multus-additional-cni-plugins-8h5m5" Apr 21 01:50:04.035730 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.033448 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/214bcf67-a154-4c72-a914-d9efa8bdfee9-host-var-lib-cni-multus\") pod \"multus-fdckn\" (UID: \"214bcf67-a154-4c72-a914-d9efa8bdfee9\") " pod="openshift-multus/multus-fdckn" Apr 21 01:50:04.035730 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.033474 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/214bcf67-a154-4c72-a914-d9efa8bdfee9-host-var-lib-kubelet\") pod \"multus-fdckn\" (UID: \"214bcf67-a154-4c72-a914-d9efa8bdfee9\") " pod="openshift-multus/multus-fdckn" Apr 21 01:50:04.035730 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.033482 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/f7e3c291-0094-4e7c-9d88-7f0864c79c8d-device-dir\") pod \"aws-ebs-csi-driver-node-cd5f6\" (UID: \"f7e3c291-0094-4e7c-9d88-7f0864c79c8d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cd5f6" Apr 21 01:50:04.035730 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.033505 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ed377958-ce5b-41c7-9512-4b95b799767d-host-slash\") 
pod \"ovnkube-node-f2tdx\" (UID: \"ed377958-ce5b-41c7-9512-4b95b799767d\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2tdx" Apr 21 01:50:04.035730 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.033550 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ed377958-ce5b-41c7-9512-4b95b799767d-var-lib-openvswitch\") pod \"ovnkube-node-f2tdx\" (UID: \"ed377958-ce5b-41c7-9512-4b95b799767d\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2tdx" Apr 21 01:50:04.035730 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.033548 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/214bcf67-a154-4c72-a914-d9efa8bdfee9-multus-daemon-config\") pod \"multus-fdckn\" (UID: \"214bcf67-a154-4c72-a914-d9efa8bdfee9\") " pod="openshift-multus/multus-fdckn" Apr 21 01:50:04.035730 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.033556 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/214bcf67-a154-4c72-a914-d9efa8bdfee9-host-var-lib-cni-multus\") pod \"multus-fdckn\" (UID: \"214bcf67-a154-4c72-a914-d9efa8bdfee9\") " pod="openshift-multus/multus-fdckn" Apr 21 01:50:04.035730 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.033584 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/214bcf67-a154-4c72-a914-d9efa8bdfee9-host-var-lib-kubelet\") pod \"multus-fdckn\" (UID: \"214bcf67-a154-4c72-a914-d9efa8bdfee9\") " pod="openshift-multus/multus-fdckn" Apr 21 01:50:04.036554 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.033617 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f7e3c291-0094-4e7c-9d88-7f0864c79c8d-kubelet-dir\") pod 
\"aws-ebs-csi-driver-node-cd5f6\" (UID: \"f7e3c291-0094-4e7c-9d88-7f0864c79c8d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cd5f6"
Apr 21 01:50:04.036554 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.033646 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/4d1027fb-9f71-4cf4-b0db-1f2916e50320-etc-sysctl-conf\") pod \"tuned-l9kzx\" (UID: \"4d1027fb-9f71-4cf4-b0db-1f2916e50320\") " pod="openshift-cluster-node-tuning-operator/tuned-l9kzx"
Apr 21 01:50:04.036554 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.033671 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f7e3c291-0094-4e7c-9d88-7f0864c79c8d-kubelet-dir\") pod \"aws-ebs-csi-driver-node-cd5f6\" (UID: \"f7e3c291-0094-4e7c-9d88-7f0864c79c8d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cd5f6"
Apr 21 01:50:04.036554 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.033689 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ed377958-ce5b-41c7-9512-4b95b799767d-host-slash\") pod \"ovnkube-node-f2tdx\" (UID: \"ed377958-ce5b-41c7-9512-4b95b799767d\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2tdx"
Apr 21 01:50:04.036554 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.033705 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ed377958-ce5b-41c7-9512-4b95b799767d-log-socket\") pod \"ovnkube-node-f2tdx\" (UID: \"ed377958-ce5b-41c7-9512-4b95b799767d\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2tdx"
Apr 21 01:50:04.036554 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.033745 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ed377958-ce5b-41c7-9512-4b95b799767d-ovn-node-metrics-cert\") pod \"ovnkube-node-f2tdx\" (UID: \"ed377958-ce5b-41c7-9512-4b95b799767d\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2tdx"
Apr 21 01:50:04.036554 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.033786 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/4d1027fb-9f71-4cf4-b0db-1f2916e50320-etc-sysctl-conf\") pod \"tuned-l9kzx\" (UID: \"4d1027fb-9f71-4cf4-b0db-1f2916e50320\") " pod="openshift-cluster-node-tuning-operator/tuned-l9kzx"
Apr 21 01:50:04.036554 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.033852 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ed377958-ce5b-41c7-9512-4b95b799767d-log-socket\") pod \"ovnkube-node-f2tdx\" (UID: \"ed377958-ce5b-41c7-9512-4b95b799767d\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2tdx"
Apr 21 01:50:04.036554 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.034784 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/7b9f858d-0cf6-49d9-8632-23d2d0584e24-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-8h5m5\" (UID: \"7b9f858d-0cf6-49d9-8632-23d2d0584e24\") " pod="openshift-multus/multus-additional-cni-plugins-8h5m5"
Apr 21 01:50:04.036554 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.035078 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 21 01:50:04.036554 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.035332 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/4d1027fb-9f71-4cf4-b0db-1f2916e50320-etc-tuned\") pod \"tuned-l9kzx\" (UID: \"4d1027fb-9f71-4cf4-b0db-1f2916e50320\") " pod="openshift-cluster-node-tuning-operator/tuned-l9kzx"
Apr 21 01:50:04.036554 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.035355 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4d1027fb-9f71-4cf4-b0db-1f2916e50320-tmp\") pod \"tuned-l9kzx\" (UID: \"4d1027fb-9f71-4cf4-b0db-1f2916e50320\") " pod="openshift-cluster-node-tuning-operator/tuned-l9kzx"
Apr 21 01:50:04.036554 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.035670 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/25059272-61bd-4d87-9141-9036eaa06ce3-agent-certs\") pod \"konnectivity-agent-l94wt\" (UID: \"25059272-61bd-4d87-9141-9036eaa06ce3\") " pod="kube-system/konnectivity-agent-l94wt"
Apr 21 01:50:04.036554 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.036065 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ed377958-ce5b-41c7-9512-4b95b799767d-ovn-node-metrics-cert\") pod \"ovnkube-node-f2tdx\" (UID: \"ed377958-ce5b-41c7-9512-4b95b799767d\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2tdx"
Apr 21 01:50:04.036554 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.036350 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/25059272-61bd-4d87-9141-9036eaa06ce3-konnectivity-ca\") pod \"konnectivity-agent-l94wt\" (UID: \"25059272-61bd-4d87-9141-9036eaa06ce3\") " pod="kube-system/konnectivity-agent-l94wt"
Apr 21 01:50:04.036554 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:50:04.036507 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 21 01:50:04.036554 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:50:04.036527 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 21 01:50:04.036554 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:50:04.036541 2573 projected.go:194] Error preparing data for projected volume kube-api-access-mm6p6 for pod openshift-network-diagnostics/network-check-target-bvqj7: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 01:50:04.037559 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:50:04.036643 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/560db137-e262-4c6c-9380-c422a8537e5e-kube-api-access-mm6p6 podName:560db137-e262-4c6c-9380-c422a8537e5e nodeName:}" failed. No retries permitted until 2026-04-21 01:50:04.536590519 +0000 UTC m=+2.115892581 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-mm6p6" (UniqueName: "kubernetes.io/projected/560db137-e262-4c6c-9380-c422a8537e5e-kube-api-access-mm6p6") pod "network-check-target-bvqj7" (UID: "560db137-e262-4c6c-9380-c422a8537e5e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 01:50:04.038586 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.038518 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mptvn\" (UniqueName: \"kubernetes.io/projected/7b9f858d-0cf6-49d9-8632-23d2d0584e24-kube-api-access-mptvn\") pod \"multus-additional-cni-plugins-8h5m5\" (UID: \"7b9f858d-0cf6-49d9-8632-23d2d0584e24\") " pod="openshift-multus/multus-additional-cni-plugins-8h5m5"
Apr 21 01:50:04.038944 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.038898 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2vrm\" (UniqueName: \"kubernetes.io/projected/4d1027fb-9f71-4cf4-b0db-1f2916e50320-kube-api-access-c2vrm\") pod \"tuned-l9kzx\" (UID: \"4d1027fb-9f71-4cf4-b0db-1f2916e50320\") " pod="openshift-cluster-node-tuning-operator/tuned-l9kzx"
Apr 21 01:50:04.039328 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.039309 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8bvh\" (UniqueName: \"kubernetes.io/projected/6de78603-c646-45a5-8bc4-9cfc56456d0f-kube-api-access-v8bvh\") pod \"node-ca-xfl2x\" (UID: \"6de78603-c646-45a5-8bc4-9cfc56456d0f\") " pod="openshift-image-registry/node-ca-xfl2x"
Apr 21 01:50:04.039862 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.039842 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7d5b\" (UniqueName: \"kubernetes.io/projected/ed377958-ce5b-41c7-9512-4b95b799767d-kube-api-access-z7d5b\") pod \"ovnkube-node-f2tdx\" (UID: \"ed377958-ce5b-41c7-9512-4b95b799767d\") " pod="openshift-ovn-kubernetes/ovnkube-node-f2tdx"
Apr 21 01:50:04.040396 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.040376 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6gzqh\" (UniqueName: \"kubernetes.io/projected/cacbe46e-f843-4ec6-b2e4-222e2ab51feb-kube-api-access-6gzqh\") pod \"iptables-alerter-2wlcr\" (UID: \"cacbe46e-f843-4ec6-b2e4-222e2ab51feb\") " pod="openshift-network-operator/iptables-alerter-2wlcr"
Apr 21 01:50:04.043786 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.041081 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmb9g\" (UniqueName: \"kubernetes.io/projected/214bcf67-a154-4c72-a914-d9efa8bdfee9-kube-api-access-cmb9g\") pod \"multus-fdckn\" (UID: \"214bcf67-a154-4c72-a914-d9efa8bdfee9\") " pod="openshift-multus/multus-fdckn"
Apr 21 01:50:04.043786 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.041513 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-49fp5\" (UniqueName: \"kubernetes.io/projected/cc11073d-4e8d-4b5b-b1bd-40ad61aa03b0-kube-api-access-49fp5\") pod \"node-resolver-w4c4r\" (UID: \"cc11073d-4e8d-4b5b-b1bd-40ad61aa03b0\") " pod="openshift-dns/node-resolver-w4c4r"
Apr 21 01:50:04.043786 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.042203 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vbf8p\" (UniqueName: \"kubernetes.io/projected/f7e3c291-0094-4e7c-9d88-7f0864c79c8d-kube-api-access-vbf8p\") pod \"aws-ebs-csi-driver-node-cd5f6\" (UID: \"f7e3c291-0094-4e7c-9d88-7f0864c79c8d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cd5f6"
Apr 21 01:50:04.045523 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.045505 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-fdckn"
Apr 21 01:50:04.051857 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.051840 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-l94wt"
Apr 21 01:50:04.053493 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.053475 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-w4c4r"
Apr 21 01:50:04.054194 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:04.054157 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod214bcf67_a154_4c72_a914_d9efa8bdfee9.slice/crio-86a05dc7f024f5d27545c12f089b167c488905ea6f4d255c534ef75f37d99d0c WatchSource:0}: Error finding container 86a05dc7f024f5d27545c12f089b167c488905ea6f4d255c534ef75f37d99d0c: Status 404 returned error can't find the container with id 86a05dc7f024f5d27545c12f089b167c488905ea6f4d255c534ef75f37d99d0c
Apr 21 01:50:04.061031 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.060990 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-52.ec2.internal" event={"ID":"053154a49e4e985b6b9ca90f37194bef","Type":"ContainerStarted","Data":"fb9079a563f7112fb1aadff3e0480ffaca6c8735329b356721b87c8997024ec0"}
Apr 21 01:50:04.061385 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:04.061365 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod25059272_61bd_4d87_9141_9036eaa06ce3.slice/crio-5fbea7787ede8d8dd97a4a67aee826ee4f389964c62fb88f1bdd4a68441b8344 WatchSource:0}: Error finding container 5fbea7787ede8d8dd97a4a67aee826ee4f389964c62fb88f1bdd4a68441b8344: Status 404 returned error can't find the container with id 5fbea7787ede8d8dd97a4a67aee826ee4f389964c62fb88f1bdd4a68441b8344
Apr 21 01:50:04.062022 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.061989 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-fdckn" event={"ID":"214bcf67-a154-4c72-a914-d9efa8bdfee9","Type":"ContainerStarted","Data":"86a05dc7f024f5d27545c12f089b167c488905ea6f4d255c534ef75f37d99d0c"}
Apr 21 01:50:04.062918 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.062891 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-52.ec2.internal" event={"ID":"bc7f96eee01b7ac864ac98f5fb6e45b4","Type":"ContainerStarted","Data":"50f75d479825e2edf10ab94cdfaa7f461ca603536eb38cfdd985f40d5f61454b"}
Apr 21 01:50:04.064394 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:04.064375 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcc11073d_4e8d_4b5b_b1bd_40ad61aa03b0.slice/crio-6036f3c7a5f5b09e0c390a645946b1e8c7c018e55000bc83ab0ced0347ea2e9c WatchSource:0}: Error finding container 6036f3c7a5f5b09e0c390a645946b1e8c7c018e55000bc83ab0ced0347ea2e9c: Status 404 returned error can't find the container with id 6036f3c7a5f5b09e0c390a645946b1e8c7c018e55000bc83ab0ced0347ea2e9c
Apr 21 01:50:04.134180 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.134085 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9c103689-40cc-470b-9109-33a63ff6f5dd-metrics-certs\") pod \"network-metrics-daemon-mfs4c\" (UID: \"9c103689-40cc-470b-9109-33a63ff6f5dd\") " pod="openshift-multus/network-metrics-daemon-mfs4c"
Apr 21 01:50:04.134180 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.134150 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t7wlr\" (UniqueName: \"kubernetes.io/projected/9c103689-40cc-470b-9109-33a63ff6f5dd-kube-api-access-t7wlr\") pod \"network-metrics-daemon-mfs4c\" (UID: \"9c103689-40cc-470b-9109-33a63ff6f5dd\") " pod="openshift-multus/network-metrics-daemon-mfs4c"
Apr 21 01:50:04.134405 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:50:04.134297 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 01:50:04.134405 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:50:04.134368 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9c103689-40cc-470b-9109-33a63ff6f5dd-metrics-certs podName:9c103689-40cc-470b-9109-33a63ff6f5dd nodeName:}" failed. No retries permitted until 2026-04-21 01:50:04.634345225 +0000 UTC m=+2.213647282 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9c103689-40cc-470b-9109-33a63ff6f5dd-metrics-certs") pod "network-metrics-daemon-mfs4c" (UID: "9c103689-40cc-470b-9109-33a63ff6f5dd") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 01:50:04.142240 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.142213 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7wlr\" (UniqueName: \"kubernetes.io/projected/9c103689-40cc-470b-9109-33a63ff6f5dd-kube-api-access-t7wlr\") pod \"network-metrics-daemon-mfs4c\" (UID: \"9c103689-40cc-470b-9109-33a63ff6f5dd\") " pod="openshift-multus/network-metrics-daemon-mfs4c"
Apr 21 01:50:04.241259 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.241229 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-2wlcr"
Apr 21 01:50:04.247493 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:04.247465 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcacbe46e_f843_4ec6_b2e4_222e2ab51feb.slice/crio-4101a3cfee6246dabf5575459412112e454a171aa016b410b1d040d6375d53ef WatchSource:0}: Error finding container 4101a3cfee6246dabf5575459412112e454a171aa016b410b1d040d6375d53ef: Status 404 returned error can't find the container with id 4101a3cfee6246dabf5575459412112e454a171aa016b410b1d040d6375d53ef
Apr 21 01:50:04.250362 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.250318 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-8h5m5"
Apr 21 01:50:04.256414 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:04.256384 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7b9f858d_0cf6_49d9_8632_23d2d0584e24.slice/crio-4bd3a6896d1f9358e63554d3072b4aeaac02bf2677a75eb2251006aea82b51a0 WatchSource:0}: Error finding container 4bd3a6896d1f9358e63554d3072b4aeaac02bf2677a75eb2251006aea82b51a0: Status 404 returned error can't find the container with id 4bd3a6896d1f9358e63554d3072b4aeaac02bf2677a75eb2251006aea82b51a0
Apr 21 01:50:04.269069 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.269038 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-f2tdx"
Apr 21 01:50:04.276062 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:04.276023 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poded377958_ce5b_41c7_9512_4b95b799767d.slice/crio-378a1cd80237a05563bdad37cbd8c20b1547ff620afa8a7e3f6dc1c9b89850d9 WatchSource:0}: Error finding container 378a1cd80237a05563bdad37cbd8c20b1547ff620afa8a7e3f6dc1c9b89850d9: Status 404 returned error can't find the container with id 378a1cd80237a05563bdad37cbd8c20b1547ff620afa8a7e3f6dc1c9b89850d9
Apr 21 01:50:04.288858 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.288827 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cd5f6"
Apr 21 01:50:04.295275 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:04.295239 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf7e3c291_0094_4e7c_9d88_7f0864c79c8d.slice/crio-6f279f73ef1891a56a33fd4bf96eeb0ee6a371820678254096935c9e9a76bd58 WatchSource:0}: Error finding container 6f279f73ef1891a56a33fd4bf96eeb0ee6a371820678254096935c9e9a76bd58: Status 404 returned error can't find the container with id 6f279f73ef1891a56a33fd4bf96eeb0ee6a371820678254096935c9e9a76bd58
Apr 21 01:50:04.308280 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.308253 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-l9kzx"
Apr 21 01:50:04.314509 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:04.314476 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4d1027fb_9f71_4cf4_b0db_1f2916e50320.slice/crio-912d22f7748919e4a1f0ac9594db518cfb6850e7c0391862759fa8a71e931b6d WatchSource:0}: Error finding container 912d22f7748919e4a1f0ac9594db518cfb6850e7c0391862759fa8a71e931b6d: Status 404 returned error can't find the container with id 912d22f7748919e4a1f0ac9594db518cfb6850e7c0391862759fa8a71e931b6d
Apr 21 01:50:04.335766 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.335735 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-xfl2x"
Apr 21 01:50:04.341713 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:50:04.341682 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6de78603_c646_45a5_8bc4_9cfc56456d0f.slice/crio-e6e0028372cdb769d5610e475b6363520272c4fd840eda5f2252387c67f6dec8 WatchSource:0}: Error finding container e6e0028372cdb769d5610e475b6363520272c4fd840eda5f2252387c67f6dec8: Status 404 returned error can't find the container with id e6e0028372cdb769d5610e475b6363520272c4fd840eda5f2252387c67f6dec8
Apr 21 01:50:04.536707 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.536624 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mm6p6\" (UniqueName: \"kubernetes.io/projected/560db137-e262-4c6c-9380-c422a8537e5e-kube-api-access-mm6p6\") pod \"network-check-target-bvqj7\" (UID: \"560db137-e262-4c6c-9380-c422a8537e5e\") " pod="openshift-network-diagnostics/network-check-target-bvqj7"
Apr 21 01:50:04.536880 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:50:04.536845 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 21 01:50:04.536880 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:50:04.536866 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 21 01:50:04.536880 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:50:04.536879 2573 projected.go:194] Error preparing data for projected volume kube-api-access-mm6p6 for pod openshift-network-diagnostics/network-check-target-bvqj7: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 01:50:04.537048 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:50:04.536950 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/560db137-e262-4c6c-9380-c422a8537e5e-kube-api-access-mm6p6 podName:560db137-e262-4c6c-9380-c422a8537e5e nodeName:}" failed. No retries permitted until 2026-04-21 01:50:05.536920528 +0000 UTC m=+3.116222569 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-mm6p6" (UniqueName: "kubernetes.io/projected/560db137-e262-4c6c-9380-c422a8537e5e-kube-api-access-mm6p6") pod "network-check-target-bvqj7" (UID: "560db137-e262-4c6c-9380-c422a8537e5e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 01:50:04.638021 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.637971 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9c103689-40cc-470b-9109-33a63ff6f5dd-metrics-certs\") pod \"network-metrics-daemon-mfs4c\" (UID: \"9c103689-40cc-470b-9109-33a63ff6f5dd\") " pod="openshift-multus/network-metrics-daemon-mfs4c"
Apr 21 01:50:04.638199 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:50:04.638170 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 01:50:04.638270 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:50:04.638254 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9c103689-40cc-470b-9109-33a63ff6f5dd-metrics-certs podName:9c103689-40cc-470b-9109-33a63ff6f5dd nodeName:}" failed. No retries permitted until 2026-04-21 01:50:05.638215337 +0000 UTC m=+3.217517381 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9c103689-40cc-470b-9109-33a63ff6f5dd-metrics-certs") pod "network-metrics-daemon-mfs4c" (UID: "9c103689-40cc-470b-9109-33a63ff6f5dd") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 01:50:04.749148 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.748875 2573 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 21 01:50:04.974402 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.974312 2573 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-20 01:45:03 +0000 UTC" deadline="2027-11-07 20:20:50.746542149 +0000 UTC"
Apr 21 01:50:04.974402 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:04.974350 2573 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13578h30m45.772196331s"
Apr 21 01:50:05.058849 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:05.058800 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mfs4c"
Apr 21 01:50:05.059039 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:50:05.058965 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mfs4c" podUID="9c103689-40cc-470b-9109-33a63ff6f5dd"
Apr 21 01:50:05.084874 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:05.084469 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-l9kzx" event={"ID":"4d1027fb-9f71-4cf4-b0db-1f2916e50320","Type":"ContainerStarted","Data":"912d22f7748919e4a1f0ac9594db518cfb6850e7c0391862759fa8a71e931b6d"}
Apr 21 01:50:05.090537 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:05.090502 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cd5f6" event={"ID":"f7e3c291-0094-4e7c-9d88-7f0864c79c8d","Type":"ContainerStarted","Data":"6f279f73ef1891a56a33fd4bf96eeb0ee6a371820678254096935c9e9a76bd58"}
Apr 21 01:50:05.099850 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:05.096983 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8h5m5" event={"ID":"7b9f858d-0cf6-49d9-8632-23d2d0584e24","Type":"ContainerStarted","Data":"4bd3a6896d1f9358e63554d3072b4aeaac02bf2677a75eb2251006aea82b51a0"}
Apr 21 01:50:05.102555 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:05.102287 2573 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 21 01:50:05.115904 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:05.115848 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-l94wt" event={"ID":"25059272-61bd-4d87-9141-9036eaa06ce3","Type":"ContainerStarted","Data":"5fbea7787ede8d8dd97a4a67aee826ee4f389964c62fb88f1bdd4a68441b8344"}
Apr 21 01:50:05.148860 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:05.148807 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-xfl2x" event={"ID":"6de78603-c646-45a5-8bc4-9cfc56456d0f","Type":"ContainerStarted","Data":"e6e0028372cdb769d5610e475b6363520272c4fd840eda5f2252387c67f6dec8"}
Apr 21 01:50:05.162507 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:05.162470 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f2tdx" event={"ID":"ed377958-ce5b-41c7-9512-4b95b799767d","Type":"ContainerStarted","Data":"378a1cd80237a05563bdad37cbd8c20b1547ff620afa8a7e3f6dc1c9b89850d9"}
Apr 21 01:50:05.170658 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:05.170622 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-2wlcr" event={"ID":"cacbe46e-f843-4ec6-b2e4-222e2ab51feb","Type":"ContainerStarted","Data":"4101a3cfee6246dabf5575459412112e454a171aa016b410b1d040d6375d53ef"}
Apr 21 01:50:05.193723 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:05.193686 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-w4c4r" event={"ID":"cc11073d-4e8d-4b5b-b1bd-40ad61aa03b0","Type":"ContainerStarted","Data":"6036f3c7a5f5b09e0c390a645946b1e8c7c018e55000bc83ab0ced0347ea2e9c"}
Apr 21 01:50:05.275958 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:05.275880 2573 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 21 01:50:05.546872 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:05.546179 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mm6p6\" (UniqueName: \"kubernetes.io/projected/560db137-e262-4c6c-9380-c422a8537e5e-kube-api-access-mm6p6\") pod \"network-check-target-bvqj7\" (UID: \"560db137-e262-4c6c-9380-c422a8537e5e\") " pod="openshift-network-diagnostics/network-check-target-bvqj7"
Apr 21 01:50:05.546872 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:50:05.546335 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 21 01:50:05.546872 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:50:05.546352 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 21 01:50:05.546872 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:50:05.546366 2573 projected.go:194] Error preparing data for projected volume kube-api-access-mm6p6 for pod openshift-network-diagnostics/network-check-target-bvqj7: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 01:50:05.546872 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:50:05.546423 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/560db137-e262-4c6c-9380-c422a8537e5e-kube-api-access-mm6p6 podName:560db137-e262-4c6c-9380-c422a8537e5e nodeName:}" failed. No retries permitted until 2026-04-21 01:50:07.546404488 +0000 UTC m=+5.125706528 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-mm6p6" (UniqueName: "kubernetes.io/projected/560db137-e262-4c6c-9380-c422a8537e5e-kube-api-access-mm6p6") pod "network-check-target-bvqj7" (UID: "560db137-e262-4c6c-9380-c422a8537e5e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 01:50:05.647348 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:05.647314 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9c103689-40cc-470b-9109-33a63ff6f5dd-metrics-certs\") pod \"network-metrics-daemon-mfs4c\" (UID: \"9c103689-40cc-470b-9109-33a63ff6f5dd\") " pod="openshift-multus/network-metrics-daemon-mfs4c"
Apr 21 01:50:05.647525 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:50:05.647463 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 01:50:05.647583 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:50:05.647526 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9c103689-40cc-470b-9109-33a63ff6f5dd-metrics-certs podName:9c103689-40cc-470b-9109-33a63ff6f5dd nodeName:}" failed. No retries permitted until 2026-04-21 01:50:07.647506799 +0000 UTC m=+5.226808844 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9c103689-40cc-470b-9109-33a63ff6f5dd-metrics-certs") pod "network-metrics-daemon-mfs4c" (UID: "9c103689-40cc-470b-9109-33a63ff6f5dd") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 01:50:05.975087 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:05.974993 2573 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-20 01:45:03 +0000 UTC" deadline="2027-09-19 10:50:59.765228934 +0000 UTC"
Apr 21 01:50:05.975087 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:05.975034 2573 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12393h0m53.790198429s"
Apr 21 01:50:06.059101 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:06.059069 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bvqj7"
Apr 21 01:50:06.059260 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:50:06.059223 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bvqj7" podUID="560db137-e262-4c6c-9380-c422a8537e5e"
Apr 21 01:50:07.059540 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:07.058778 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mfs4c"
Apr 21 01:50:07.059540 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:50:07.058935 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mfs4c" podUID="9c103689-40cc-470b-9109-33a63ff6f5dd"
Apr 21 01:50:07.565194 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:07.565119 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mm6p6\" (UniqueName: \"kubernetes.io/projected/560db137-e262-4c6c-9380-c422a8537e5e-kube-api-access-mm6p6\") pod \"network-check-target-bvqj7\" (UID: \"560db137-e262-4c6c-9380-c422a8537e5e\") " pod="openshift-network-diagnostics/network-check-target-bvqj7"
Apr 21 01:50:07.565422 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:50:07.565286 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 21 01:50:07.565422 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:50:07.565310 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 21 01:50:07.565422 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:50:07.565336 2573 projected.go:194] Error preparing data for projected volume kube-api-access-mm6p6 for pod openshift-network-diagnostics/network-check-target-bvqj7: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 01:50:07.565422 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:50:07.565396 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/560db137-e262-4c6c-9380-c422a8537e5e-kube-api-access-mm6p6 podName:560db137-e262-4c6c-9380-c422a8537e5e nodeName:}" failed. No retries permitted until 2026-04-21 01:50:11.565378302 +0000 UTC m=+9.144680353 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-mm6p6" (UniqueName: "kubernetes.io/projected/560db137-e262-4c6c-9380-c422a8537e5e-kube-api-access-mm6p6") pod "network-check-target-bvqj7" (UID: "560db137-e262-4c6c-9380-c422a8537e5e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 01:50:07.666285 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:07.666247 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9c103689-40cc-470b-9109-33a63ff6f5dd-metrics-certs\") pod \"network-metrics-daemon-mfs4c\" (UID: \"9c103689-40cc-470b-9109-33a63ff6f5dd\") " pod="openshift-multus/network-metrics-daemon-mfs4c"
Apr 21 01:50:07.666467 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:50:07.666397 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 01:50:07.666467 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:50:07.666457 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9c103689-40cc-470b-9109-33a63ff6f5dd-metrics-certs podName:9c103689-40cc-470b-9109-33a63ff6f5dd nodeName:}" failed. No retries permitted until 2026-04-21 01:50:11.666438684 +0000 UTC m=+9.245740726 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9c103689-40cc-470b-9109-33a63ff6f5dd-metrics-certs") pod "network-metrics-daemon-mfs4c" (UID: "9c103689-40cc-470b-9109-33a63ff6f5dd") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 01:50:08.059343 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:08.059228 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bvqj7"
Apr 21 01:50:08.059503 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:50:08.059364 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bvqj7" podUID="560db137-e262-4c6c-9380-c422a8537e5e"
Apr 21 01:50:09.059907 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:09.059873 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mfs4c"
Apr 21 01:50:09.060375 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:50:09.060023 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mfs4c" podUID="9c103689-40cc-470b-9109-33a63ff6f5dd"
Apr 21 01:50:10.059548 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:10.059496 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bvqj7"
Apr 21 01:50:10.059728 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:50:10.059626 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bvqj7" podUID="560db137-e262-4c6c-9380-c422a8537e5e"
Apr 21 01:50:11.058830 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:11.058774 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mfs4c"
Apr 21 01:50:11.059383 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:50:11.058947 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-multus/network-metrics-daemon-mfs4c" podUID="9c103689-40cc-470b-9109-33a63ff6f5dd" Apr 21 01:50:11.598239 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:11.598158 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mm6p6\" (UniqueName: \"kubernetes.io/projected/560db137-e262-4c6c-9380-c422a8537e5e-kube-api-access-mm6p6\") pod \"network-check-target-bvqj7\" (UID: \"560db137-e262-4c6c-9380-c422a8537e5e\") " pod="openshift-network-diagnostics/network-check-target-bvqj7" Apr 21 01:50:11.598433 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:50:11.598342 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 01:50:11.598433 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:50:11.598370 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 01:50:11.598433 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:50:11.598384 2573 projected.go:194] Error preparing data for projected volume kube-api-access-mm6p6 for pod openshift-network-diagnostics/network-check-target-bvqj7: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 01:50:11.598594 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:50:11.598443 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/560db137-e262-4c6c-9380-c422a8537e5e-kube-api-access-mm6p6 podName:560db137-e262-4c6c-9380-c422a8537e5e nodeName:}" failed. No retries permitted until 2026-04-21 01:50:19.598424287 +0000 UTC m=+17.177726328 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-mm6p6" (UniqueName: "kubernetes.io/projected/560db137-e262-4c6c-9380-c422a8537e5e-kube-api-access-mm6p6") pod "network-check-target-bvqj7" (UID: "560db137-e262-4c6c-9380-c422a8537e5e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 01:50:11.700211 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:11.699573 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9c103689-40cc-470b-9109-33a63ff6f5dd-metrics-certs\") pod \"network-metrics-daemon-mfs4c\" (UID: \"9c103689-40cc-470b-9109-33a63ff6f5dd\") " pod="openshift-multus/network-metrics-daemon-mfs4c" Apr 21 01:50:11.700211 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:50:11.699770 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 01:50:11.700211 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:50:11.699849 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9c103689-40cc-470b-9109-33a63ff6f5dd-metrics-certs podName:9c103689-40cc-470b-9109-33a63ff6f5dd nodeName:}" failed. No retries permitted until 2026-04-21 01:50:19.699829759 +0000 UTC m=+17.279131810 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9c103689-40cc-470b-9109-33a63ff6f5dd-metrics-certs") pod "network-metrics-daemon-mfs4c" (UID: "9c103689-40cc-470b-9109-33a63ff6f5dd") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 01:50:12.059267 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:12.059184 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bvqj7" Apr 21 01:50:12.059763 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:50:12.059319 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bvqj7" podUID="560db137-e262-4c6c-9380-c422a8537e5e" Apr 21 01:50:13.060380 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:13.059804 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mfs4c" Apr 21 01:50:13.060380 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:50:13.059946 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mfs4c" podUID="9c103689-40cc-470b-9109-33a63ff6f5dd" Apr 21 01:50:14.058717 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:14.058674 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bvqj7" Apr 21 01:50:14.058906 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:50:14.058810 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-bvqj7" podUID="560db137-e262-4c6c-9380-c422a8537e5e" Apr 21 01:50:15.059292 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:15.059255 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mfs4c" Apr 21 01:50:15.059769 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:50:15.059391 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mfs4c" podUID="9c103689-40cc-470b-9109-33a63ff6f5dd" Apr 21 01:50:16.059546 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:16.059512 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bvqj7" Apr 21 01:50:16.060025 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:50:16.059643 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bvqj7" podUID="560db137-e262-4c6c-9380-c422a8537e5e" Apr 21 01:50:17.059118 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:17.059074 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-mfs4c" Apr 21 01:50:17.059293 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:50:17.059199 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mfs4c" podUID="9c103689-40cc-470b-9109-33a63ff6f5dd" Apr 21 01:50:18.058834 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:18.058792 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bvqj7" Apr 21 01:50:18.059297 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:50:18.058936 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bvqj7" podUID="560db137-e262-4c6c-9380-c422a8537e5e" Apr 21 01:50:19.058775 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:19.058742 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mfs4c" Apr 21 01:50:19.058953 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:50:19.058899 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mfs4c" podUID="9c103689-40cc-470b-9109-33a63ff6f5dd" Apr 21 01:50:19.654417 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:19.654357 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mm6p6\" (UniqueName: \"kubernetes.io/projected/560db137-e262-4c6c-9380-c422a8537e5e-kube-api-access-mm6p6\") pod \"network-check-target-bvqj7\" (UID: \"560db137-e262-4c6c-9380-c422a8537e5e\") " pod="openshift-network-diagnostics/network-check-target-bvqj7" Apr 21 01:50:19.654775 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:50:19.654482 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 01:50:19.654775 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:50:19.654507 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 01:50:19.654775 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:50:19.654520 2573 projected.go:194] Error preparing data for projected volume kube-api-access-mm6p6 for pod openshift-network-diagnostics/network-check-target-bvqj7: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 01:50:19.654775 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:50:19.654584 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/560db137-e262-4c6c-9380-c422a8537e5e-kube-api-access-mm6p6 podName:560db137-e262-4c6c-9380-c422a8537e5e nodeName:}" failed. No retries permitted until 2026-04-21 01:50:35.654561345 +0000 UTC m=+33.233863404 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-mm6p6" (UniqueName: "kubernetes.io/projected/560db137-e262-4c6c-9380-c422a8537e5e-kube-api-access-mm6p6") pod "network-check-target-bvqj7" (UID: "560db137-e262-4c6c-9380-c422a8537e5e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 01:50:19.755448 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:19.755404 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9c103689-40cc-470b-9109-33a63ff6f5dd-metrics-certs\") pod \"network-metrics-daemon-mfs4c\" (UID: \"9c103689-40cc-470b-9109-33a63ff6f5dd\") " pod="openshift-multus/network-metrics-daemon-mfs4c" Apr 21 01:50:19.755634 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:50:19.755580 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 01:50:19.755705 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:50:19.755658 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9c103689-40cc-470b-9109-33a63ff6f5dd-metrics-certs podName:9c103689-40cc-470b-9109-33a63ff6f5dd nodeName:}" failed. No retries permitted until 2026-04-21 01:50:35.755638362 +0000 UTC m=+33.334940410 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9c103689-40cc-470b-9109-33a63ff6f5dd-metrics-certs") pod "network-metrics-daemon-mfs4c" (UID: "9c103689-40cc-470b-9109-33a63ff6f5dd") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 01:50:20.059277 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:20.059177 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bvqj7" Apr 21 01:50:20.059722 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:50:20.059309 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bvqj7" podUID="560db137-e262-4c6c-9380-c422a8537e5e" Apr 21 01:50:20.940999 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:20.940965 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-dxbng"] Apr 21 01:50:20.967907 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:20.967871 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dxbng" Apr 21 01:50:20.968090 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:50:20.967979 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dxbng" podUID="be58fe9c-7c3d-40c0-9c75-448c5d3d856c" Apr 21 01:50:21.059009 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:21.058975 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mfs4c" Apr 21 01:50:21.059211 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:50:21.059174 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mfs4c" podUID="9c103689-40cc-470b-9109-33a63ff6f5dd" Apr 21 01:50:21.065933 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:21.065906 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/be58fe9c-7c3d-40c0-9c75-448c5d3d856c-dbus\") pod \"global-pull-secret-syncer-dxbng\" (UID: \"be58fe9c-7c3d-40c0-9c75-448c5d3d856c\") " pod="kube-system/global-pull-secret-syncer-dxbng" Apr 21 01:50:21.066341 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:21.065953 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/be58fe9c-7c3d-40c0-9c75-448c5d3d856c-original-pull-secret\") pod \"global-pull-secret-syncer-dxbng\" (UID: \"be58fe9c-7c3d-40c0-9c75-448c5d3d856c\") " pod="kube-system/global-pull-secret-syncer-dxbng" Apr 21 01:50:21.066341 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:21.066027 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/be58fe9c-7c3d-40c0-9c75-448c5d3d856c-kubelet-config\") pod \"global-pull-secret-syncer-dxbng\" (UID: \"be58fe9c-7c3d-40c0-9c75-448c5d3d856c\") " pod="kube-system/global-pull-secret-syncer-dxbng" Apr 21 01:50:21.167323 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:21.167288 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/be58fe9c-7c3d-40c0-9c75-448c5d3d856c-dbus\") pod \"global-pull-secret-syncer-dxbng\" (UID: \"be58fe9c-7c3d-40c0-9c75-448c5d3d856c\") " pod="kube-system/global-pull-secret-syncer-dxbng" Apr 21 01:50:21.167323 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:21.167335 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/be58fe9c-7c3d-40c0-9c75-448c5d3d856c-original-pull-secret\") pod \"global-pull-secret-syncer-dxbng\" (UID: \"be58fe9c-7c3d-40c0-9c75-448c5d3d856c\") " pod="kube-system/global-pull-secret-syncer-dxbng" Apr 21 01:50:21.167599 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:21.167368 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/be58fe9c-7c3d-40c0-9c75-448c5d3d856c-kubelet-config\") pod \"global-pull-secret-syncer-dxbng\" (UID: \"be58fe9c-7c3d-40c0-9c75-448c5d3d856c\") " pod="kube-system/global-pull-secret-syncer-dxbng" Apr 21 01:50:21.167599 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:21.167465 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/be58fe9c-7c3d-40c0-9c75-448c5d3d856c-kubelet-config\") pod \"global-pull-secret-syncer-dxbng\" (UID: \"be58fe9c-7c3d-40c0-9c75-448c5d3d856c\") " pod="kube-system/global-pull-secret-syncer-dxbng" Apr 21 01:50:21.167599 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:21.167496 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/be58fe9c-7c3d-40c0-9c75-448c5d3d856c-dbus\") pod \"global-pull-secret-syncer-dxbng\" (UID: \"be58fe9c-7c3d-40c0-9c75-448c5d3d856c\") " pod="kube-system/global-pull-secret-syncer-dxbng" Apr 21 01:50:21.167599 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:50:21.167475 2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 21 01:50:21.167599 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:50:21.167571 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/be58fe9c-7c3d-40c0-9c75-448c5d3d856c-original-pull-secret podName:be58fe9c-7c3d-40c0-9c75-448c5d3d856c nodeName:}" failed. 
No retries permitted until 2026-04-21 01:50:21.667553566 +0000 UTC m=+19.246855614 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/be58fe9c-7c3d-40c0-9c75-448c5d3d856c-original-pull-secret") pod "global-pull-secret-syncer-dxbng" (UID: "be58fe9c-7c3d-40c0-9c75-448c5d3d856c") : object "kube-system"/"original-pull-secret" not registered Apr 21 01:50:21.672041 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:21.671998 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/be58fe9c-7c3d-40c0-9c75-448c5d3d856c-original-pull-secret\") pod \"global-pull-secret-syncer-dxbng\" (UID: \"be58fe9c-7c3d-40c0-9c75-448c5d3d856c\") " pod="kube-system/global-pull-secret-syncer-dxbng" Apr 21 01:50:21.672232 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:50:21.672141 2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 21 01:50:21.672232 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:50:21.672218 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/be58fe9c-7c3d-40c0-9c75-448c5d3d856c-original-pull-secret podName:be58fe9c-7c3d-40c0-9c75-448c5d3d856c nodeName:}" failed. No retries permitted until 2026-04-21 01:50:22.672200048 +0000 UTC m=+20.251502101 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/be58fe9c-7c3d-40c0-9c75-448c5d3d856c-original-pull-secret") pod "global-pull-secret-syncer-dxbng" (UID: "be58fe9c-7c3d-40c0-9c75-448c5d3d856c") : object "kube-system"/"original-pull-secret" not registered Apr 21 01:50:22.059447 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:22.059415 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-dxbng" Apr 21 01:50:22.059610 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:22.059415 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bvqj7" Apr 21 01:50:22.059610 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:50:22.059541 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dxbng" podUID="be58fe9c-7c3d-40c0-9c75-448c5d3d856c" Apr 21 01:50:22.059610 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:50:22.059586 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-bvqj7" podUID="560db137-e262-4c6c-9380-c422a8537e5e" Apr 21 01:50:22.679148 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:22.678972 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/be58fe9c-7c3d-40c0-9c75-448c5d3d856c-original-pull-secret\") pod \"global-pull-secret-syncer-dxbng\" (UID: \"be58fe9c-7c3d-40c0-9c75-448c5d3d856c\") " pod="kube-system/global-pull-secret-syncer-dxbng" Apr 21 01:50:22.679910 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:50:22.679126 2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 21 01:50:22.679910 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:50:22.679248 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/be58fe9c-7c3d-40c0-9c75-448c5d3d856c-original-pull-secret podName:be58fe9c-7c3d-40c0-9c75-448c5d3d856c nodeName:}" failed. No retries permitted until 2026-04-21 01:50:24.679226405 +0000 UTC m=+22.258528443 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/be58fe9c-7c3d-40c0-9c75-448c5d3d856c-original-pull-secret") pod "global-pull-secret-syncer-dxbng" (UID: "be58fe9c-7c3d-40c0-9c75-448c5d3d856c") : object "kube-system"/"original-pull-secret" not registered Apr 21 01:50:23.060230 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:23.059993 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-mfs4c" Apr 21 01:50:23.060373 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:50:23.060334 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mfs4c" podUID="9c103689-40cc-470b-9109-33a63ff6f5dd" Apr 21 01:50:23.232810 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:23.232708 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f2tdx_ed377958-ce5b-41c7-9512-4b95b799767d/ovn-acl-logging/0.log" Apr 21 01:50:23.233036 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:23.233010 2573 generic.go:358] "Generic (PLEG): container finished" podID="ed377958-ce5b-41c7-9512-4b95b799767d" containerID="067b1b81886b79777741cfe7807f28c85119df76162a9d001e1b1fe5496efca0" exitCode=1 Apr 21 01:50:23.233104 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:23.233080 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f2tdx" event={"ID":"ed377958-ce5b-41c7-9512-4b95b799767d","Type":"ContainerStarted","Data":"8825ea545d5ce1136551d7a17cd359f7aa23d31b448704b9810f9bc9beebb034"} Apr 21 01:50:23.233147 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:23.233119 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f2tdx" event={"ID":"ed377958-ce5b-41c7-9512-4b95b799767d","Type":"ContainerStarted","Data":"1292a64ff38cd6ac9510ecaa202ed36f5a11429fbe43d5e0d2c937fe1a045201"} Apr 21 01:50:23.233147 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:23.233134 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f2tdx" 
event={"ID":"ed377958-ce5b-41c7-9512-4b95b799767d","Type":"ContainerStarted","Data":"ec4ac12c7653975946cf30d77f597bda301d424f8aab75917ad5baa885e83e17"} Apr 21 01:50:23.233229 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:23.233147 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f2tdx" event={"ID":"ed377958-ce5b-41c7-9512-4b95b799767d","Type":"ContainerStarted","Data":"2660c6bb6bd704fb431272d452c58719a0ce93eba62373176de2d80362acfd95"} Apr 21 01:50:23.233229 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:23.233159 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f2tdx" event={"ID":"ed377958-ce5b-41c7-9512-4b95b799767d","Type":"ContainerDied","Data":"067b1b81886b79777741cfe7807f28c85119df76162a9d001e1b1fe5496efca0"} Apr 21 01:50:23.233229 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:23.233174 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f2tdx" event={"ID":"ed377958-ce5b-41c7-9512-4b95b799767d","Type":"ContainerStarted","Data":"77b7a6e2465d499e2231a837a022e0a0f002161670eb45a4a1c7737597126b70"} Apr 21 01:50:23.234296 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:23.234278 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-w4c4r" event={"ID":"cc11073d-4e8d-4b5b-b1bd-40ad61aa03b0","Type":"ContainerStarted","Data":"5bb2135645e7e2b2792faadbbddf7fad88639f3b89964d7b13334e028942902b"} Apr 21 01:50:23.235450 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:23.235433 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-fdckn" event={"ID":"214bcf67-a154-4c72-a914-d9efa8bdfee9","Type":"ContainerStarted","Data":"6e2de04790e301e0cc2af79906cb791ba00419d0c820780e305a7856eca9e10e"} Apr 21 01:50:23.236678 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:23.236654 2573 generic.go:358] "Generic (PLEG): container finished" podID="053154a49e4e985b6b9ca90f37194bef" 
containerID="e0e2fbdbe21499030158d888de4a56ea7a50bc5eabdfaf931b4609f6e6f8c1c3" exitCode=0 Apr 21 01:50:23.236747 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:23.236719 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-52.ec2.internal" event={"ID":"053154a49e4e985b6b9ca90f37194bef","Type":"ContainerDied","Data":"e0e2fbdbe21499030158d888de4a56ea7a50bc5eabdfaf931b4609f6e6f8c1c3"} Apr 21 01:50:23.238040 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:23.238022 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-l9kzx" event={"ID":"4d1027fb-9f71-4cf4-b0db-1f2916e50320","Type":"ContainerStarted","Data":"1d37ded2e6478f65813dafafde2783369a1fffcfc4c36a88c8ca34822115f73b"} Apr 21 01:50:23.239283 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:23.239265 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cd5f6" event={"ID":"f7e3c291-0094-4e7c-9d88-7f0864c79c8d","Type":"ContainerStarted","Data":"4f9928256327dc3377960cf39bcecc81a86740d26d6e42f85ee73c8de198f05f"} Apr 21 01:50:23.240460 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:23.240442 2573 generic.go:358] "Generic (PLEG): container finished" podID="7b9f858d-0cf6-49d9-8632-23d2d0584e24" containerID="f97f2567236970be0354a033c4b5769dc881be3db8f8ec3245f49bf1f086b46f" exitCode=0 Apr 21 01:50:23.240517 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:23.240493 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8h5m5" event={"ID":"7b9f858d-0cf6-49d9-8632-23d2d0584e24","Type":"ContainerDied","Data":"f97f2567236970be0354a033c4b5769dc881be3db8f8ec3245f49bf1f086b46f"} Apr 21 01:50:23.241788 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:23.241685 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-l94wt" 
event={"ID":"25059272-61bd-4d87-9141-9036eaa06ce3","Type":"ContainerStarted","Data":"6d8dd874a38066276d1f6af82fcc7fc2714dd0c9711e8e9a193c6f94a9e2c942"}
Apr 21 01:50:23.242993 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:23.242902 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-52.ec2.internal" event={"ID":"bc7f96eee01b7ac864ac98f5fb6e45b4","Type":"ContainerStarted","Data":"d53d7a3a5fef91d212495772f25096c619f4e04c75cc88140dcfe55c792ed8b5"}
Apr 21 01:50:23.244139 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:23.244122 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-xfl2x" event={"ID":"6de78603-c646-45a5-8bc4-9cfc56456d0f","Type":"ContainerStarted","Data":"9851f81f9ba77675c8769975dae062cc88ec5a87548fbc68c3326b236a913857"}
Apr 21 01:50:23.249130 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:23.249073 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-w4c4r" podStartSLOduration=2.217396527 podStartE2EDuration="20.249059608s" podCreationTimestamp="2026-04-21 01:50:03 +0000 UTC" firstStartedPulling="2026-04-21 01:50:04.065843101 +0000 UTC m=+1.645145141" lastFinishedPulling="2026-04-21 01:50:22.097506176 +0000 UTC m=+19.676808222" observedRunningTime="2026-04-21 01:50:23.248577443 +0000 UTC m=+20.827879502" watchObservedRunningTime="2026-04-21 01:50:23.249059608 +0000 UTC m=+20.828361667"
Apr 21 01:50:23.262883 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:23.262799 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-xfl2x" podStartSLOduration=10.509059373 podStartE2EDuration="20.262780367s" podCreationTimestamp="2026-04-21 01:50:03 +0000 UTC" firstStartedPulling="2026-04-21 01:50:04.343223225 +0000 UTC m=+1.922525261" lastFinishedPulling="2026-04-21 01:50:14.096944216 +0000 UTC m=+11.676246255" observedRunningTime="2026-04-21 01:50:23.262737122 +0000
UTC m=+20.842039182" watchObservedRunningTime="2026-04-21 01:50:23.262780367 +0000 UTC m=+20.842082425"
Apr 21 01:50:23.282576 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:23.282523 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-fdckn" podStartSLOduration=2.233927714 podStartE2EDuration="20.282504608s" podCreationTimestamp="2026-04-21 01:50:03 +0000 UTC" firstStartedPulling="2026-04-21 01:50:04.056117302 +0000 UTC m=+1.635419339" lastFinishedPulling="2026-04-21 01:50:22.104694185 +0000 UTC m=+19.683996233" observedRunningTime="2026-04-21 01:50:23.281889574 +0000 UTC m=+20.861191629" watchObservedRunningTime="2026-04-21 01:50:23.282504608 +0000 UTC m=+20.861806667"
Apr 21 01:50:23.294211 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:23.294149 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-l94wt" podStartSLOduration=2.260073506 podStartE2EDuration="20.294130796s" podCreationTimestamp="2026-04-21 01:50:03 +0000 UTC" firstStartedPulling="2026-04-21 01:50:04.063193768 +0000 UTC m=+1.642495804" lastFinishedPulling="2026-04-21 01:50:22.097251051 +0000 UTC m=+19.676553094" observedRunningTime="2026-04-21 01:50:23.294036612 +0000 UTC m=+20.873338673" watchObservedRunningTime="2026-04-21 01:50:23.294130796 +0000 UTC m=+20.873432858"
Apr 21 01:50:23.324517 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:23.324470 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-52.ec2.internal" podStartSLOduration=20.324456152 podStartE2EDuration="20.324456152s" podCreationTimestamp="2026-04-21 01:50:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 01:50:23.324094947 +0000 UTC m=+20.903397018" watchObservedRunningTime="2026-04-21 01:50:23.324456152 +0000 UTC m=+20.903758210"
Apr 21 01:50:23.341872
ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:23.341193 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-l9kzx" podStartSLOduration=2.557222443 podStartE2EDuration="20.341171853s" podCreationTimestamp="2026-04-21 01:50:03 +0000 UTC" firstStartedPulling="2026-04-21 01:50:04.316068938 +0000 UTC m=+1.895370976" lastFinishedPulling="2026-04-21 01:50:22.100018336 +0000 UTC m=+19.679320386" observedRunningTime="2026-04-21 01:50:23.340360193 +0000 UTC m=+20.919662252" watchObservedRunningTime="2026-04-21 01:50:23.341171853 +0000 UTC m=+20.920473914"
Apr 21 01:50:23.878146 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:23.878122 2573 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 21 01:50:24.003218 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:24.003044 2573 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-21T01:50:23.878142081Z","UUID":"535c9d2d-e92f-431d-b9f2-f7b9a83a76bb","Handler":null,"Name":"","Endpoint":""}
Apr 21 01:50:24.004888 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:24.004866 2573 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 21 01:50:24.005023 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:24.004895 2573 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 21 01:50:24.059154 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:24.059123 2573 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bvqj7"
Apr 21 01:50:24.059317 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:24.059123 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dxbng"
Apr 21 01:50:24.059317 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:50:24.059245 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bvqj7" podUID="560db137-e262-4c6c-9380-c422a8537e5e"
Apr 21 01:50:24.059419 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:50:24.059331 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="kube-system/global-pull-secret-syncer-dxbng" podUID="be58fe9c-7c3d-40c0-9c75-448c5d3d856c"
Apr 21 01:50:24.248454 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:24.248412 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-52.ec2.internal" event={"ID":"053154a49e4e985b6b9ca90f37194bef","Type":"ContainerStarted","Data":"0a6de193ba718d50018db79d00e617e2f586c03593652dbfb11a446093d83f21"}
Apr 21 01:50:24.250171 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:24.250141 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cd5f6" event={"ID":"f7e3c291-0094-4e7c-9d88-7f0864c79c8d","Type":"ContainerStarted","Data":"7819059ff342bf3961013091d9bbe03995eb78a8f96757cefde4fc65a39214cc"}
Apr 21 01:50:24.251671 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:24.251633 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-2wlcr" event={"ID":"cacbe46e-f843-4ec6-b2e4-222e2ab51feb","Type":"ContainerStarted","Data":"b07a76ceb2983c5b4ffcc111ce4b8073d71d1af757353ff7da184d479f9b3a67"}
Apr 21 01:50:24.262235 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:24.262152 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-52.ec2.internal" podStartSLOduration=21.262135158 podStartE2EDuration="21.262135158s" podCreationTimestamp="2026-04-21 01:50:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 01:50:24.262070523 +0000 UTC m=+21.841372581" watchObservedRunningTime="2026-04-21 01:50:24.262135158 +0000 UTC m=+21.841437217"
Apr 21 01:50:24.695474 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:24.695422 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName:
\"kubernetes.io/secret/be58fe9c-7c3d-40c0-9c75-448c5d3d856c-original-pull-secret\") pod \"global-pull-secret-syncer-dxbng\" (UID: \"be58fe9c-7c3d-40c0-9c75-448c5d3d856c\") " pod="kube-system/global-pull-secret-syncer-dxbng"
Apr 21 01:50:24.695717 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:50:24.695597 2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 21 01:50:24.695717 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:50:24.695675 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/be58fe9c-7c3d-40c0-9c75-448c5d3d856c-original-pull-secret podName:be58fe9c-7c3d-40c0-9c75-448c5d3d856c nodeName:}" failed. No retries permitted until 2026-04-21 01:50:28.695655141 +0000 UTC m=+26.274957183 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/be58fe9c-7c3d-40c0-9c75-448c5d3d856c-original-pull-secret") pod "global-pull-secret-syncer-dxbng" (UID: "be58fe9c-7c3d-40c0-9c75-448c5d3d856c") : object "kube-system"/"original-pull-secret" not registered
Apr 21 01:50:25.059213 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:25.059033 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mfs4c"
Apr 21 01:50:25.059647 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:50:25.059332 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-multus/network-metrics-daemon-mfs4c" podUID="9c103689-40cc-470b-9109-33a63ff6f5dd"
Apr 21 01:50:25.224063 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:25.223964 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-l94wt"
Apr 21 01:50:25.224700 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:25.224668 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-l94wt"
Apr 21 01:50:25.238899 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:25.238836 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-2wlcr" podStartSLOduration=4.418441767 podStartE2EDuration="22.238800965s" podCreationTimestamp="2026-04-21 01:50:03 +0000 UTC" firstStartedPulling="2026-04-21 01:50:04.248985291 +0000 UTC m=+1.828287327" lastFinishedPulling="2026-04-21 01:50:22.069344482 +0000 UTC m=+19.648646525" observedRunningTime="2026-04-21 01:50:24.274583411 +0000 UTC m=+21.853885468" watchObservedRunningTime="2026-04-21 01:50:25.238800965 +0000 UTC m=+22.818103024"
Apr 21 01:50:25.255916 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:25.255881 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cd5f6" event={"ID":"f7e3c291-0094-4e7c-9d88-7f0864c79c8d","Type":"ContainerStarted","Data":"12d991d08394cd91566b48c40e328b6f17bb72d940132044d89a457d1ff9e461"}
Apr 21 01:50:25.258788 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:25.258747 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f2tdx_ed377958-ce5b-41c7-9512-4b95b799767d/ovn-acl-logging/0.log"
Apr 21 01:50:25.259581 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:25.259549 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f2tdx"
event={"ID":"ed377958-ce5b-41c7-9512-4b95b799767d","Type":"ContainerStarted","Data":"f39d08bd923eeedbfaa3f196fd5968107e8f3fb0930c0a0c09f781a0ac626a32"}
Apr 21 01:50:25.260063 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:25.259932 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-l94wt"
Apr 21 01:50:25.260586 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:25.260555 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-l94wt"
Apr 21 01:50:25.274039 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:25.273983 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cd5f6" podStartSLOduration=1.84638312 podStartE2EDuration="22.273964279s" podCreationTimestamp="2026-04-21 01:50:03 +0000 UTC" firstStartedPulling="2026-04-21 01:50:04.298192775 +0000 UTC m=+1.877494814" lastFinishedPulling="2026-04-21 01:50:24.725773928 +0000 UTC m=+22.305075973" observedRunningTime="2026-04-21 01:50:25.273843023 +0000 UTC m=+22.853145082" watchObservedRunningTime="2026-04-21 01:50:25.273964279 +0000 UTC m=+22.853266337"
Apr 21 01:50:26.058734 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:26.058706 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bvqj7"
Apr 21 01:50:26.058734 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:26.058743 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dxbng"
Apr 21 01:50:26.058998 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:50:26.058836 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bvqj7" podUID="560db137-e262-4c6c-9380-c422a8537e5e"
Apr 21 01:50:26.058998 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:50:26.058947 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dxbng" podUID="be58fe9c-7c3d-40c0-9c75-448c5d3d856c"
Apr 21 01:50:27.059322 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:27.059288 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mfs4c"
Apr 21 01:50:27.059895 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:50:27.059427 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mfs4c" podUID="9c103689-40cc-470b-9109-33a63ff6f5dd"
Apr 21 01:50:28.059312 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:28.059134 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bvqj7"
Apr 21 01:50:28.059510 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:28.059157 2573 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="kube-system/global-pull-secret-syncer-dxbng"
Apr 21 01:50:28.059510 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:50:28.059408 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bvqj7" podUID="560db137-e262-4c6c-9380-c422a8537e5e"
Apr 21 01:50:28.059510 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:50:28.059468 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dxbng" podUID="be58fe9c-7c3d-40c0-9c75-448c5d3d856c"
Apr 21 01:50:28.270324 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:28.270289 2573 generic.go:358] "Generic (PLEG): container finished" podID="7b9f858d-0cf6-49d9-8632-23d2d0584e24" containerID="034b7ff27163b5a11e36577ffd2694bb0289179e820827daf813c62bb48516e5" exitCode=0
Apr 21 01:50:28.270504 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:28.270359 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8h5m5" event={"ID":"7b9f858d-0cf6-49d9-8632-23d2d0584e24","Type":"ContainerDied","Data":"034b7ff27163b5a11e36577ffd2694bb0289179e820827daf813c62bb48516e5"}
Apr 21 01:50:28.273284 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:28.273264 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f2tdx_ed377958-ce5b-41c7-9512-4b95b799767d/ovn-acl-logging/0.log"
Apr 21 01:50:28.273608 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:28.273581 2573
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f2tdx" event={"ID":"ed377958-ce5b-41c7-9512-4b95b799767d","Type":"ContainerStarted","Data":"01e7ef11c9423d957336e006873198aede69d4b48529a6bcbf91d987d9f115e7"}
Apr 21 01:50:28.273923 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:28.273900 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-f2tdx"
Apr 21 01:50:28.274031 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:28.273931 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-f2tdx"
Apr 21 01:50:28.274083 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:28.274068 2573 scope.go:117] "RemoveContainer" containerID="067b1b81886b79777741cfe7807f28c85119df76162a9d001e1b1fe5496efca0"
Apr 21 01:50:28.289780 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:28.289752 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-f2tdx"
Apr 21 01:50:28.727071 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:28.727038 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/be58fe9c-7c3d-40c0-9c75-448c5d3d856c-original-pull-secret\") pod \"global-pull-secret-syncer-dxbng\" (UID: \"be58fe9c-7c3d-40c0-9c75-448c5d3d856c\") " pod="kube-system/global-pull-secret-syncer-dxbng"
Apr 21 01:50:28.727270 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:50:28.727213 2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 21 01:50:28.727334 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:50:28.727296 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/be58fe9c-7c3d-40c0-9c75-448c5d3d856c-original-pull-secret podName:be58fe9c-7c3d-40c0-9c75-448c5d3d856c nodeName:}" failed.
No retries permitted until 2026-04-21 01:50:36.727279433 +0000 UTC m=+34.306581479 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/be58fe9c-7c3d-40c0-9c75-448c5d3d856c-original-pull-secret") pod "global-pull-secret-syncer-dxbng" (UID: "be58fe9c-7c3d-40c0-9c75-448c5d3d856c") : object "kube-system"/"original-pull-secret" not registered
Apr 21 01:50:29.059444 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:29.059415 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mfs4c"
Apr 21 01:50:29.059580 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:50:29.059562 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-multus/network-metrics-daemon-mfs4c" podUID="9c103689-40cc-470b-9109-33a63ff6f5dd"
Apr 21 01:50:29.278139 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:29.278064 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f2tdx_ed377958-ce5b-41c7-9512-4b95b799767d/ovn-acl-logging/0.log"
Apr 21 01:50:29.278445 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:29.278417 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f2tdx" event={"ID":"ed377958-ce5b-41c7-9512-4b95b799767d","Type":"ContainerStarted","Data":"a87950e879363327f3cf9634c2e2860b11cbb9a0696f1d30283ebbfe5025b36f"}
Apr 21 01:50:29.278680 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:29.278664 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-f2tdx"
Apr 21 01:50:29.280293 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:29.280271 2573 generic.go:358] "Generic (PLEG): container finished" podID="7b9f858d-0cf6-49d9-8632-23d2d0584e24" containerID="00faa0006eab45b8349498e6a768e0f530cb412b1abda6efaec6c5cf6467aece" exitCode=0
Apr 21 01:50:29.280423 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:29.280305 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8h5m5" event={"ID":"7b9f858d-0cf6-49d9-8632-23d2d0584e24","Type":"ContainerDied","Data":"00faa0006eab45b8349498e6a768e0f530cb412b1abda6efaec6c5cf6467aece"}
Apr 21 01:50:29.295703 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:29.295658 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-f2tdx"
Apr 21 01:50:29.303645 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:29.303615 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-bvqj7"]
Apr 21 01:50:29.303844 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:29.303762
2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bvqj7"
Apr 21 01:50:29.303928 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:50:29.303906 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bvqj7" podUID="560db137-e262-4c6c-9380-c422a8537e5e"
Apr 21 01:50:29.304558 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:29.304539 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-dxbng"]
Apr 21 01:50:29.304647 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:29.304636 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dxbng"
Apr 21 01:50:29.304728 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:50:29.304713 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dxbng" podUID="be58fe9c-7c3d-40c0-9c75-448c5d3d856c"
Apr 21 01:50:29.305802 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:29.305777 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-mfs4c"]
Apr 21 01:50:29.305917 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:29.305902 2573 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-multus/network-metrics-daemon-mfs4c"
Apr 21 01:50:29.306028 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:50:29.306006 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mfs4c" podUID="9c103689-40cc-470b-9109-33a63ff6f5dd"
Apr 21 01:50:29.309524 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:29.308777 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-f2tdx" podStartSLOduration=8.266125364 podStartE2EDuration="26.308762468s" podCreationTimestamp="2026-04-21 01:50:03 +0000 UTC" firstStartedPulling="2026-04-21 01:50:04.27756386 +0000 UTC m=+1.856865895" lastFinishedPulling="2026-04-21 01:50:22.32020096 +0000 UTC m=+19.899502999" observedRunningTime="2026-04-21 01:50:29.307672268 +0000 UTC m=+26.886974324" watchObservedRunningTime="2026-04-21 01:50:29.308762468 +0000 UTC m=+26.888064527"
Apr 21 01:50:30.284381 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:30.284213 2573 generic.go:358] "Generic (PLEG): container finished" podID="7b9f858d-0cf6-49d9-8632-23d2d0584e24" containerID="6a0b43f9e73317dbe528e45784ac9c0fbc31b7e08e43f22cbb462a9edf6d4252" exitCode=0
Apr 21 01:50:30.284381 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:30.284295 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8h5m5" event={"ID":"7b9f858d-0cf6-49d9-8632-23d2d0584e24","Type":"ContainerDied","Data":"6a0b43f9e73317dbe528e45784ac9c0fbc31b7e08e43f22cbb462a9edf6d4252"}
Apr 21 01:50:31.059476 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:31.059438 2573 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="kube-system/global-pull-secret-syncer-dxbng"
Apr 21 01:50:31.059476 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:31.059438 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mfs4c"
Apr 21 01:50:31.059712 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:31.059500 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bvqj7"
Apr 21 01:50:31.059712 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:50:31.059585 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mfs4c" podUID="9c103689-40cc-470b-9109-33a63ff6f5dd"
Apr 21 01:50:31.059712 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:50:31.059666 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bvqj7" podUID="560db137-e262-4c6c-9380-c422a8537e5e"
Apr 21 01:50:31.059872 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:50:31.059741 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="kube-system/global-pull-secret-syncer-dxbng" podUID="be58fe9c-7c3d-40c0-9c75-448c5d3d856c"
Apr 21 01:50:33.060362 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:33.060325 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mfs4c"
Apr 21 01:50:33.060920 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:50:33.060442 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mfs4c" podUID="9c103689-40cc-470b-9109-33a63ff6f5dd"
Apr 21 01:50:33.060920 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:33.060534 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dxbng"
Apr 21 01:50:33.060920 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:50:33.060668 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dxbng" podUID="be58fe9c-7c3d-40c0-9c75-448c5d3d856c"
Apr 21 01:50:33.060920 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:33.060713 2573 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bvqj7"
Apr 21 01:50:33.060920 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:50:33.060869 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bvqj7" podUID="560db137-e262-4c6c-9380-c422a8537e5e"
Apr 21 01:50:35.059688 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:35.059650 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mfs4c"
Apr 21 01:50:35.060271 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:35.059649 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bvqj7"
Apr 21 01:50:35.060271 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:35.059660 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dxbng"
Apr 21 01:50:35.060271 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:50:35.059807 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-multus/network-metrics-daemon-mfs4c" podUID="9c103689-40cc-470b-9109-33a63ff6f5dd" Apr 21 01:50:35.060271 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:50:35.059921 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dxbng" podUID="be58fe9c-7c3d-40c0-9c75-448c5d3d856c" Apr 21 01:50:35.060271 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:50:35.059980 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bvqj7" podUID="560db137-e262-4c6c-9380-c422a8537e5e" Apr 21 01:50:35.206522 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:35.206438 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-52.ec2.internal" event="NodeReady" Apr 21 01:50:35.206685 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:35.206602 2573 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 21 01:50:35.251526 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:35.251492 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-w2qdq"] Apr 21 01:50:35.275411 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:35.275382 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-r9t8h"] Apr 21 01:50:35.275574 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:35.275558 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-w2qdq" Apr 21 01:50:35.278232 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:35.278173 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-vd9s5\"" Apr 21 01:50:35.278415 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:35.278397 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 21 01:50:35.278791 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:35.278678 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 21 01:50:35.293952 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:35.293923 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-w2qdq"] Apr 21 01:50:35.294090 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:35.293957 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-r9t8h"] Apr 21 01:50:35.294090 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:35.294071 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-r9t8h" Apr 21 01:50:35.296508 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:35.296486 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 21 01:50:35.296643 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:35.296558 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 21 01:50:35.296643 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:35.296599 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-sbqdn\"" Apr 21 01:50:35.296643 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:35.296556 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 21 01:50:35.373207 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:35.373168 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8598fef8-2cf7-4d82-aa02-44eac46217af-tmp-dir\") pod \"dns-default-w2qdq\" (UID: \"8598fef8-2cf7-4d82-aa02-44eac46217af\") " pod="openshift-dns/dns-default-w2qdq" Apr 21 01:50:35.373394 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:35.373246 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8598fef8-2cf7-4d82-aa02-44eac46217af-config-volume\") pod \"dns-default-w2qdq\" (UID: \"8598fef8-2cf7-4d82-aa02-44eac46217af\") " pod="openshift-dns/dns-default-w2qdq" Apr 21 01:50:35.373394 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:35.373271 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/ff1d28c8-cbb1-4385-9c36-da62f691590f-cert\") pod \"ingress-canary-r9t8h\" (UID: \"ff1d28c8-cbb1-4385-9c36-da62f691590f\") " pod="openshift-ingress-canary/ingress-canary-r9t8h" Apr 21 01:50:35.373394 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:35.373292 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8598fef8-2cf7-4d82-aa02-44eac46217af-metrics-tls\") pod \"dns-default-w2qdq\" (UID: \"8598fef8-2cf7-4d82-aa02-44eac46217af\") " pod="openshift-dns/dns-default-w2qdq" Apr 21 01:50:35.373394 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:35.373320 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tp6vz\" (UniqueName: \"kubernetes.io/projected/8598fef8-2cf7-4d82-aa02-44eac46217af-kube-api-access-tp6vz\") pod \"dns-default-w2qdq\" (UID: \"8598fef8-2cf7-4d82-aa02-44eac46217af\") " pod="openshift-dns/dns-default-w2qdq" Apr 21 01:50:35.373394 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:35.373375 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fnjt\" (UniqueName: \"kubernetes.io/projected/ff1d28c8-cbb1-4385-9c36-da62f691590f-kube-api-access-8fnjt\") pod \"ingress-canary-r9t8h\" (UID: \"ff1d28c8-cbb1-4385-9c36-da62f691590f\") " pod="openshift-ingress-canary/ingress-canary-r9t8h" Apr 21 01:50:35.474726 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:35.474638 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8598fef8-2cf7-4d82-aa02-44eac46217af-config-volume\") pod \"dns-default-w2qdq\" (UID: \"8598fef8-2cf7-4d82-aa02-44eac46217af\") " pod="openshift-dns/dns-default-w2qdq" Apr 21 01:50:35.474726 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:35.474676 2573 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ff1d28c8-cbb1-4385-9c36-da62f691590f-cert\") pod \"ingress-canary-r9t8h\" (UID: \"ff1d28c8-cbb1-4385-9c36-da62f691590f\") " pod="openshift-ingress-canary/ingress-canary-r9t8h" Apr 21 01:50:35.474726 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:35.474708 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8598fef8-2cf7-4d82-aa02-44eac46217af-metrics-tls\") pod \"dns-default-w2qdq\" (UID: \"8598fef8-2cf7-4d82-aa02-44eac46217af\") " pod="openshift-dns/dns-default-w2qdq" Apr 21 01:50:35.475025 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:35.474760 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tp6vz\" (UniqueName: \"kubernetes.io/projected/8598fef8-2cf7-4d82-aa02-44eac46217af-kube-api-access-tp6vz\") pod \"dns-default-w2qdq\" (UID: \"8598fef8-2cf7-4d82-aa02-44eac46217af\") " pod="openshift-dns/dns-default-w2qdq" Apr 21 01:50:35.475025 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:50:35.474833 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 01:50:35.475025 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:35.474843 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8fnjt\" (UniqueName: \"kubernetes.io/projected/ff1d28c8-cbb1-4385-9c36-da62f691590f-kube-api-access-8fnjt\") pod \"ingress-canary-r9t8h\" (UID: \"ff1d28c8-cbb1-4385-9c36-da62f691590f\") " pod="openshift-ingress-canary/ingress-canary-r9t8h" Apr 21 01:50:35.475025 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:35.474871 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8598fef8-2cf7-4d82-aa02-44eac46217af-tmp-dir\") pod \"dns-default-w2qdq\" (UID: \"8598fef8-2cf7-4d82-aa02-44eac46217af\") " 
pod="openshift-dns/dns-default-w2qdq" Apr 21 01:50:35.475025 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:50:35.474894 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8598fef8-2cf7-4d82-aa02-44eac46217af-metrics-tls podName:8598fef8-2cf7-4d82-aa02-44eac46217af nodeName:}" failed. No retries permitted until 2026-04-21 01:50:35.974875217 +0000 UTC m=+33.554177267 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8598fef8-2cf7-4d82-aa02-44eac46217af-metrics-tls") pod "dns-default-w2qdq" (UID: "8598fef8-2cf7-4d82-aa02-44eac46217af") : secret "dns-default-metrics-tls" not found Apr 21 01:50:35.475025 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:50:35.475009 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 01:50:35.475298 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:50:35.475047 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ff1d28c8-cbb1-4385-9c36-da62f691590f-cert podName:ff1d28c8-cbb1-4385-9c36-da62f691590f nodeName:}" failed. No retries permitted until 2026-04-21 01:50:35.975035485 +0000 UTC m=+33.554337521 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ff1d28c8-cbb1-4385-9c36-da62f691590f-cert") pod "ingress-canary-r9t8h" (UID: "ff1d28c8-cbb1-4385-9c36-da62f691590f") : secret "canary-serving-cert" not found Apr 21 01:50:35.475298 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:35.475187 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8598fef8-2cf7-4d82-aa02-44eac46217af-tmp-dir\") pod \"dns-default-w2qdq\" (UID: \"8598fef8-2cf7-4d82-aa02-44eac46217af\") " pod="openshift-dns/dns-default-w2qdq" Apr 21 01:50:35.475383 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:35.475331 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8598fef8-2cf7-4d82-aa02-44eac46217af-config-volume\") pod \"dns-default-w2qdq\" (UID: \"8598fef8-2cf7-4d82-aa02-44eac46217af\") " pod="openshift-dns/dns-default-w2qdq" Apr 21 01:50:35.485863 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:35.485838 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tp6vz\" (UniqueName: \"kubernetes.io/projected/8598fef8-2cf7-4d82-aa02-44eac46217af-kube-api-access-tp6vz\") pod \"dns-default-w2qdq\" (UID: \"8598fef8-2cf7-4d82-aa02-44eac46217af\") " pod="openshift-dns/dns-default-w2qdq" Apr 21 01:50:35.486002 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:35.485980 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8fnjt\" (UniqueName: \"kubernetes.io/projected/ff1d28c8-cbb1-4385-9c36-da62f691590f-kube-api-access-8fnjt\") pod \"ingress-canary-r9t8h\" (UID: \"ff1d28c8-cbb1-4385-9c36-da62f691590f\") " pod="openshift-ingress-canary/ingress-canary-r9t8h" Apr 21 01:50:35.677084 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:35.677051 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mm6p6\" (UniqueName: 
\"kubernetes.io/projected/560db137-e262-4c6c-9380-c422a8537e5e-kube-api-access-mm6p6\") pod \"network-check-target-bvqj7\" (UID: \"560db137-e262-4c6c-9380-c422a8537e5e\") " pod="openshift-network-diagnostics/network-check-target-bvqj7" Apr 21 01:50:35.677253 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:50:35.677222 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 01:50:35.677253 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:50:35.677244 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 01:50:35.677253 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:50:35.677254 2573 projected.go:194] Error preparing data for projected volume kube-api-access-mm6p6 for pod openshift-network-diagnostics/network-check-target-bvqj7: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 01:50:35.677355 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:50:35.677305 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/560db137-e262-4c6c-9380-c422a8537e5e-kube-api-access-mm6p6 podName:560db137-e262-4c6c-9380-c422a8537e5e nodeName:}" failed. No retries permitted until 2026-04-21 01:51:07.67729083 +0000 UTC m=+65.256592866 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-mm6p6" (UniqueName: "kubernetes.io/projected/560db137-e262-4c6c-9380-c422a8537e5e-kube-api-access-mm6p6") pod "network-check-target-bvqj7" (UID: "560db137-e262-4c6c-9380-c422a8537e5e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 01:50:35.777840 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:35.777737 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9c103689-40cc-470b-9109-33a63ff6f5dd-metrics-certs\") pod \"network-metrics-daemon-mfs4c\" (UID: \"9c103689-40cc-470b-9109-33a63ff6f5dd\") " pod="openshift-multus/network-metrics-daemon-mfs4c" Apr 21 01:50:35.778017 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:50:35.777922 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 01:50:35.778017 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:50:35.778001 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9c103689-40cc-470b-9109-33a63ff6f5dd-metrics-certs podName:9c103689-40cc-470b-9109-33a63ff6f5dd nodeName:}" failed. No retries permitted until 2026-04-21 01:51:07.777979996 +0000 UTC m=+65.357282033 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9c103689-40cc-470b-9109-33a63ff6f5dd-metrics-certs") pod "network-metrics-daemon-mfs4c" (UID: "9c103689-40cc-470b-9109-33a63ff6f5dd") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 01:50:35.979274 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:35.979238 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8598fef8-2cf7-4d82-aa02-44eac46217af-metrics-tls\") pod \"dns-default-w2qdq\" (UID: \"8598fef8-2cf7-4d82-aa02-44eac46217af\") " pod="openshift-dns/dns-default-w2qdq" Apr 21 01:50:35.979444 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:35.979351 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ff1d28c8-cbb1-4385-9c36-da62f691590f-cert\") pod \"ingress-canary-r9t8h\" (UID: \"ff1d28c8-cbb1-4385-9c36-da62f691590f\") " pod="openshift-ingress-canary/ingress-canary-r9t8h" Apr 21 01:50:35.979444 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:50:35.979394 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 01:50:35.979444 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:50:35.979427 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 01:50:35.979544 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:50:35.979473 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8598fef8-2cf7-4d82-aa02-44eac46217af-metrics-tls podName:8598fef8-2cf7-4d82-aa02-44eac46217af nodeName:}" failed. No retries permitted until 2026-04-21 01:50:36.979445465 +0000 UTC m=+34.558747514 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8598fef8-2cf7-4d82-aa02-44eac46217af-metrics-tls") pod "dns-default-w2qdq" (UID: "8598fef8-2cf7-4d82-aa02-44eac46217af") : secret "dns-default-metrics-tls" not found Apr 21 01:50:35.979544 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:50:35.979487 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ff1d28c8-cbb1-4385-9c36-da62f691590f-cert podName:ff1d28c8-cbb1-4385-9c36-da62f691590f nodeName:}" failed. No retries permitted until 2026-04-21 01:50:36.979480958 +0000 UTC m=+34.558782994 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ff1d28c8-cbb1-4385-9c36-da62f691590f-cert") pod "ingress-canary-r9t8h" (UID: "ff1d28c8-cbb1-4385-9c36-da62f691590f") : secret "canary-serving-cert" not found Apr 21 01:50:36.299570 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:36.299398 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8h5m5" event={"ID":"7b9f858d-0cf6-49d9-8632-23d2d0584e24","Type":"ContainerStarted","Data":"83cb358620cabcc46bf54fa73bccf9bb364c2a4b6f4fc3b381a8d3fc7b0cad0f"} Apr 21 01:50:36.783504 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:36.783471 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/be58fe9c-7c3d-40c0-9c75-448c5d3d856c-original-pull-secret\") pod \"global-pull-secret-syncer-dxbng\" (UID: \"be58fe9c-7c3d-40c0-9c75-448c5d3d856c\") " pod="kube-system/global-pull-secret-syncer-dxbng" Apr 21 01:50:36.783713 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:50:36.783586 2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 21 01:50:36.783713 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:50:36.783640 2573 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/be58fe9c-7c3d-40c0-9c75-448c5d3d856c-original-pull-secret podName:be58fe9c-7c3d-40c0-9c75-448c5d3d856c nodeName:}" failed. No retries permitted until 2026-04-21 01:50:52.783624977 +0000 UTC m=+50.362927013 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/be58fe9c-7c3d-40c0-9c75-448c5d3d856c-original-pull-secret") pod "global-pull-secret-syncer-dxbng" (UID: "be58fe9c-7c3d-40c0-9c75-448c5d3d856c") : object "kube-system"/"original-pull-secret" not registered Apr 21 01:50:36.984749 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:36.984701 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ff1d28c8-cbb1-4385-9c36-da62f691590f-cert\") pod \"ingress-canary-r9t8h\" (UID: \"ff1d28c8-cbb1-4385-9c36-da62f691590f\") " pod="openshift-ingress-canary/ingress-canary-r9t8h" Apr 21 01:50:36.984749 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:36.984749 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8598fef8-2cf7-4d82-aa02-44eac46217af-metrics-tls\") pod \"dns-default-w2qdq\" (UID: \"8598fef8-2cf7-4d82-aa02-44eac46217af\") " pod="openshift-dns/dns-default-w2qdq" Apr 21 01:50:36.984957 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:50:36.984880 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 01:50:36.984957 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:50:36.984890 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 01:50:36.984957 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:50:36.984947 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ff1d28c8-cbb1-4385-9c36-da62f691590f-cert podName:ff1d28c8-cbb1-4385-9c36-da62f691590f 
nodeName:}" failed. No retries permitted until 2026-04-21 01:50:38.984926862 +0000 UTC m=+36.564228921 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ff1d28c8-cbb1-4385-9c36-da62f691590f-cert") pod "ingress-canary-r9t8h" (UID: "ff1d28c8-cbb1-4385-9c36-da62f691590f") : secret "canary-serving-cert" not found Apr 21 01:50:36.985060 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:50:36.984966 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8598fef8-2cf7-4d82-aa02-44eac46217af-metrics-tls podName:8598fef8-2cf7-4d82-aa02-44eac46217af nodeName:}" failed. No retries permitted until 2026-04-21 01:50:38.984958289 +0000 UTC m=+36.564260326 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8598fef8-2cf7-4d82-aa02-44eac46217af-metrics-tls") pod "dns-default-w2qdq" (UID: "8598fef8-2cf7-4d82-aa02-44eac46217af") : secret "dns-default-metrics-tls" not found Apr 21 01:50:37.059621 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:37.059543 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bvqj7" Apr 21 01:50:37.059769 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:37.059543 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dxbng" Apr 21 01:50:37.059769 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:37.059546 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-mfs4c" Apr 21 01:50:37.062410 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:37.062389 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 21 01:50:37.062568 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:37.062453 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 21 01:50:37.062568 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:37.062522 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 21 01:50:37.062709 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:37.062648 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-nb98s\"" Apr 21 01:50:37.063563 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:37.063537 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-68xr4\"" Apr 21 01:50:37.063638 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:37.063608 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 21 01:50:37.303952 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:37.303919 2573 generic.go:358] "Generic (PLEG): container finished" podID="7b9f858d-0cf6-49d9-8632-23d2d0584e24" containerID="83cb358620cabcc46bf54fa73bccf9bb364c2a4b6f4fc3b381a8d3fc7b0cad0f" exitCode=0 Apr 21 01:50:37.304321 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:37.303982 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8h5m5" event={"ID":"7b9f858d-0cf6-49d9-8632-23d2d0584e24","Type":"ContainerDied","Data":"83cb358620cabcc46bf54fa73bccf9bb364c2a4b6f4fc3b381a8d3fc7b0cad0f"} Apr 21 
01:50:38.308143 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:38.308112 2573 generic.go:358] "Generic (PLEG): container finished" podID="7b9f858d-0cf6-49d9-8632-23d2d0584e24" containerID="f137b7409778f08c144ded679220a4e2ae2cf2ce67ea8a7397e22a7ef6237472" exitCode=0 Apr 21 01:50:38.308573 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:38.308169 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8h5m5" event={"ID":"7b9f858d-0cf6-49d9-8632-23d2d0584e24","Type":"ContainerDied","Data":"f137b7409778f08c144ded679220a4e2ae2cf2ce67ea8a7397e22a7ef6237472"} Apr 21 01:50:38.998628 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:38.998591 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ff1d28c8-cbb1-4385-9c36-da62f691590f-cert\") pod \"ingress-canary-r9t8h\" (UID: \"ff1d28c8-cbb1-4385-9c36-da62f691590f\") " pod="openshift-ingress-canary/ingress-canary-r9t8h" Apr 21 01:50:38.998628 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:38.998630 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8598fef8-2cf7-4d82-aa02-44eac46217af-metrics-tls\") pod \"dns-default-w2qdq\" (UID: \"8598fef8-2cf7-4d82-aa02-44eac46217af\") " pod="openshift-dns/dns-default-w2qdq" Apr 21 01:50:38.998831 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:50:38.998732 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 01:50:38.998831 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:50:38.998734 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 01:50:38.998831 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:50:38.998781 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8598fef8-2cf7-4d82-aa02-44eac46217af-metrics-tls 
podName:8598fef8-2cf7-4d82-aa02-44eac46217af nodeName:}" failed. No retries permitted until 2026-04-21 01:50:42.998768548 +0000 UTC m=+40.578070584 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8598fef8-2cf7-4d82-aa02-44eac46217af-metrics-tls") pod "dns-default-w2qdq" (UID: "8598fef8-2cf7-4d82-aa02-44eac46217af") : secret "dns-default-metrics-tls" not found Apr 21 01:50:38.998831 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:50:38.998794 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ff1d28c8-cbb1-4385-9c36-da62f691590f-cert podName:ff1d28c8-cbb1-4385-9c36-da62f691590f nodeName:}" failed. No retries permitted until 2026-04-21 01:50:42.998788865 +0000 UTC m=+40.578090901 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ff1d28c8-cbb1-4385-9c36-da62f691590f-cert") pod "ingress-canary-r9t8h" (UID: "ff1d28c8-cbb1-4385-9c36-da62f691590f") : secret "canary-serving-cert" not found Apr 21 01:50:39.312613 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:39.312581 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8h5m5" event={"ID":"7b9f858d-0cf6-49d9-8632-23d2d0584e24","Type":"ContainerStarted","Data":"ef1e4b4bef27def22f3e8cca5fa17ffffc1848b717c45ab3bf1832475dcd93ee"} Apr 21 01:50:39.334798 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:39.334747 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-8h5m5" podStartSLOduration=4.504853374 podStartE2EDuration="36.334731994s" podCreationTimestamp="2026-04-21 01:50:03 +0000 UTC" firstStartedPulling="2026-04-21 01:50:04.257958441 +0000 UTC m=+1.837260477" lastFinishedPulling="2026-04-21 01:50:36.087837057 +0000 UTC m=+33.667139097" observedRunningTime="2026-04-21 01:50:39.332794619 +0000 UTC m=+36.912096676" 
watchObservedRunningTime="2026-04-21 01:50:39.334731994 +0000 UTC m=+36.914034052" Apr 21 01:50:43.026335 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:43.026298 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ff1d28c8-cbb1-4385-9c36-da62f691590f-cert\") pod \"ingress-canary-r9t8h\" (UID: \"ff1d28c8-cbb1-4385-9c36-da62f691590f\") " pod="openshift-ingress-canary/ingress-canary-r9t8h" Apr 21 01:50:43.026335 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:43.026343 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8598fef8-2cf7-4d82-aa02-44eac46217af-metrics-tls\") pod \"dns-default-w2qdq\" (UID: \"8598fef8-2cf7-4d82-aa02-44eac46217af\") " pod="openshift-dns/dns-default-w2qdq" Apr 21 01:50:43.026808 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:50:43.026436 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 01:50:43.026808 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:50:43.026439 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 01:50:43.026808 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:50:43.026493 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8598fef8-2cf7-4d82-aa02-44eac46217af-metrics-tls podName:8598fef8-2cf7-4d82-aa02-44eac46217af nodeName:}" failed. No retries permitted until 2026-04-21 01:50:51.026476816 +0000 UTC m=+48.605778853 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8598fef8-2cf7-4d82-aa02-44eac46217af-metrics-tls") pod "dns-default-w2qdq" (UID: "8598fef8-2cf7-4d82-aa02-44eac46217af") : secret "dns-default-metrics-tls" not found Apr 21 01:50:43.026808 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:50:43.026509 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ff1d28c8-cbb1-4385-9c36-da62f691590f-cert podName:ff1d28c8-cbb1-4385-9c36-da62f691590f nodeName:}" failed. No retries permitted until 2026-04-21 01:50:51.026501574 +0000 UTC m=+48.605803609 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ff1d28c8-cbb1-4385-9c36-da62f691590f-cert") pod "ingress-canary-r9t8h" (UID: "ff1d28c8-cbb1-4385-9c36-da62f691590f") : secret "canary-serving-cert" not found Apr 21 01:50:51.081643 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:51.081604 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ff1d28c8-cbb1-4385-9c36-da62f691590f-cert\") pod \"ingress-canary-r9t8h\" (UID: \"ff1d28c8-cbb1-4385-9c36-da62f691590f\") " pod="openshift-ingress-canary/ingress-canary-r9t8h" Apr 21 01:50:51.081643 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:51.081647 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8598fef8-2cf7-4d82-aa02-44eac46217af-metrics-tls\") pod \"dns-default-w2qdq\" (UID: \"8598fef8-2cf7-4d82-aa02-44eac46217af\") " pod="openshift-dns/dns-default-w2qdq" Apr 21 01:50:51.082126 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:50:51.081743 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 01:50:51.082126 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:50:51.081746 2573 secret.go:189] Couldn't get secret 
openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 01:50:51.082126 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:50:51.081798 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ff1d28c8-cbb1-4385-9c36-da62f691590f-cert podName:ff1d28c8-cbb1-4385-9c36-da62f691590f nodeName:}" failed. No retries permitted until 2026-04-21 01:51:07.08178253 +0000 UTC m=+64.661084572 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ff1d28c8-cbb1-4385-9c36-da62f691590f-cert") pod "ingress-canary-r9t8h" (UID: "ff1d28c8-cbb1-4385-9c36-da62f691590f") : secret "canary-serving-cert" not found Apr 21 01:50:51.082126 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:50:51.081830 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8598fef8-2cf7-4d82-aa02-44eac46217af-metrics-tls podName:8598fef8-2cf7-4d82-aa02-44eac46217af nodeName:}" failed. No retries permitted until 2026-04-21 01:51:07.081805414 +0000 UTC m=+64.661107450 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8598fef8-2cf7-4d82-aa02-44eac46217af-metrics-tls") pod "dns-default-w2qdq" (UID: "8598fef8-2cf7-4d82-aa02-44eac46217af") : secret "dns-default-metrics-tls" not found Apr 21 01:50:52.794147 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:52.794105 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/be58fe9c-7c3d-40c0-9c75-448c5d3d856c-original-pull-secret\") pod \"global-pull-secret-syncer-dxbng\" (UID: \"be58fe9c-7c3d-40c0-9c75-448c5d3d856c\") " pod="kube-system/global-pull-secret-syncer-dxbng" Apr 21 01:50:52.797161 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:52.797134 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/be58fe9c-7c3d-40c0-9c75-448c5d3d856c-original-pull-secret\") pod \"global-pull-secret-syncer-dxbng\" (UID: \"be58fe9c-7c3d-40c0-9c75-448c5d3d856c\") " pod="kube-system/global-pull-secret-syncer-dxbng" Apr 21 01:50:52.975547 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:52.975501 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-dxbng" Apr 21 01:50:53.102144 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:53.102110 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-dxbng"] Apr 21 01:50:53.339563 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:53.339475 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-dxbng" event={"ID":"be58fe9c-7c3d-40c0-9c75-448c5d3d856c","Type":"ContainerStarted","Data":"23299d0c848c8e0c7db64f552dd10e1fc665452273ea8b38ade874fc0bc19827"} Apr 21 01:50:57.348890 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:57.348851 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-dxbng" event={"ID":"be58fe9c-7c3d-40c0-9c75-448c5d3d856c","Type":"ContainerStarted","Data":"3eb5bd4cbb0f5f0ffa7ca87843af0a8773521f6bdf591e25fc38b855f221d51c"} Apr 21 01:50:57.362852 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:50:57.362782 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-dxbng" podStartSLOduration=33.631076706 podStartE2EDuration="37.362767225s" podCreationTimestamp="2026-04-21 01:50:20 +0000 UTC" firstStartedPulling="2026-04-21 01:50:53.10716743 +0000 UTC m=+50.686469470" lastFinishedPulling="2026-04-21 01:50:56.838857943 +0000 UTC m=+54.418159989" observedRunningTime="2026-04-21 01:50:57.362149348 +0000 UTC m=+54.941451408" watchObservedRunningTime="2026-04-21 01:50:57.362767225 +0000 UTC m=+54.942069282" Apr 21 01:51:01.296934 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:51:01.296907 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-f2tdx" Apr 21 01:51:07.098406 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:51:07.098359 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/ff1d28c8-cbb1-4385-9c36-da62f691590f-cert\") pod \"ingress-canary-r9t8h\" (UID: \"ff1d28c8-cbb1-4385-9c36-da62f691590f\") " pod="openshift-ingress-canary/ingress-canary-r9t8h" Apr 21 01:51:07.098406 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:51:07.098408 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8598fef8-2cf7-4d82-aa02-44eac46217af-metrics-tls\") pod \"dns-default-w2qdq\" (UID: \"8598fef8-2cf7-4d82-aa02-44eac46217af\") " pod="openshift-dns/dns-default-w2qdq" Apr 21 01:51:07.098914 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:51:07.098508 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 01:51:07.098914 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:51:07.098557 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 01:51:07.098914 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:51:07.098583 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ff1d28c8-cbb1-4385-9c36-da62f691590f-cert podName:ff1d28c8-cbb1-4385-9c36-da62f691590f nodeName:}" failed. No retries permitted until 2026-04-21 01:51:39.098567911 +0000 UTC m=+96.677869947 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ff1d28c8-cbb1-4385-9c36-da62f691590f-cert") pod "ingress-canary-r9t8h" (UID: "ff1d28c8-cbb1-4385-9c36-da62f691590f") : secret "canary-serving-cert" not found Apr 21 01:51:07.098914 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:51:07.098616 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8598fef8-2cf7-4d82-aa02-44eac46217af-metrics-tls podName:8598fef8-2cf7-4d82-aa02-44eac46217af nodeName:}" failed. 
No retries permitted until 2026-04-21 01:51:39.098600623 +0000 UTC m=+96.677902672 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8598fef8-2cf7-4d82-aa02-44eac46217af-metrics-tls") pod "dns-default-w2qdq" (UID: "8598fef8-2cf7-4d82-aa02-44eac46217af") : secret "dns-default-metrics-tls" not found Apr 21 01:51:07.702782 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:51:07.702744 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mm6p6\" (UniqueName: \"kubernetes.io/projected/560db137-e262-4c6c-9380-c422a8537e5e-kube-api-access-mm6p6\") pod \"network-check-target-bvqj7\" (UID: \"560db137-e262-4c6c-9380-c422a8537e5e\") " pod="openshift-network-diagnostics/network-check-target-bvqj7" Apr 21 01:51:07.705438 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:51:07.705413 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 21 01:51:07.715221 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:51:07.715189 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 21 01:51:07.726475 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:51:07.726445 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mm6p6\" (UniqueName: \"kubernetes.io/projected/560db137-e262-4c6c-9380-c422a8537e5e-kube-api-access-mm6p6\") pod \"network-check-target-bvqj7\" (UID: \"560db137-e262-4c6c-9380-c422a8537e5e\") " pod="openshift-network-diagnostics/network-check-target-bvqj7" Apr 21 01:51:07.803583 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:51:07.803544 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9c103689-40cc-470b-9109-33a63ff6f5dd-metrics-certs\") pod \"network-metrics-daemon-mfs4c\" (UID: 
\"9c103689-40cc-470b-9109-33a63ff6f5dd\") " pod="openshift-multus/network-metrics-daemon-mfs4c" Apr 21 01:51:07.806756 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:51:07.806730 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 21 01:51:07.813763 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:51:07.813738 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 21 01:51:07.813907 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:51:07.813852 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9c103689-40cc-470b-9109-33a63ff6f5dd-metrics-certs podName:9c103689-40cc-470b-9109-33a63ff6f5dd nodeName:}" failed. No retries permitted until 2026-04-21 01:52:11.813830333 +0000 UTC m=+129.393132382 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9c103689-40cc-470b-9109-33a63ff6f5dd-metrics-certs") pod "network-metrics-daemon-mfs4c" (UID: "9c103689-40cc-470b-9109-33a63ff6f5dd") : secret "metrics-daemon-secret" not found Apr 21 01:51:07.972128 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:51:07.972046 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-nb98s\"" Apr 21 01:51:07.980008 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:51:07.979987 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bvqj7" Apr 21 01:51:08.089874 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:51:08.089845 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-bvqj7"] Apr 21 01:51:08.093230 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:51:08.093189 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod560db137_e262_4c6c_9380_c422a8537e5e.slice/crio-f7db8df6e9a8115ce1d99da83c6240a31593f099865aa20a249404c109e8e763 WatchSource:0}: Error finding container f7db8df6e9a8115ce1d99da83c6240a31593f099865aa20a249404c109e8e763: Status 404 returned error can't find the container with id f7db8df6e9a8115ce1d99da83c6240a31593f099865aa20a249404c109e8e763 Apr 21 01:51:08.372786 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:51:08.372746 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-bvqj7" event={"ID":"560db137-e262-4c6c-9380-c422a8537e5e","Type":"ContainerStarted","Data":"f7db8df6e9a8115ce1d99da83c6240a31593f099865aa20a249404c109e8e763"} Apr 21 01:51:11.380017 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:51:11.379985 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-bvqj7" event={"ID":"560db137-e262-4c6c-9380-c422a8537e5e","Type":"ContainerStarted","Data":"6c48f964d9bd6468b26bd980d2b19915e552ffe299c313307c94bef78c60ff85"} Apr 21 01:51:11.380441 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:51:11.380122 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-bvqj7" Apr 21 01:51:11.393972 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:51:11.393925 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-bvqj7" 
podStartSLOduration=65.851174606 podStartE2EDuration="1m8.393911114s" podCreationTimestamp="2026-04-21 01:50:03 +0000 UTC" firstStartedPulling="2026-04-21 01:51:08.095157758 +0000 UTC m=+65.674459794" lastFinishedPulling="2026-04-21 01:51:10.637894263 +0000 UTC m=+68.217196302" observedRunningTime="2026-04-21 01:51:11.393236154 +0000 UTC m=+68.972538214" watchObservedRunningTime="2026-04-21 01:51:11.393911114 +0000 UTC m=+68.973213173" Apr 21 01:51:32.362932 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:51:32.362896 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-65f9585684-glq78"] Apr 21 01:51:32.367122 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:51:32.367105 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-65f9585684-glq78" Apr 21 01:51:32.369772 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:51:32.369748 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\"" Apr 21 01:51:32.369772 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:51:32.369765 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 21 01:51:32.369988 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:51:32.369755 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" Apr 21 01:51:32.369988 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:51:32.369878 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 21 01:51:32.369988 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:51:32.369911 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\"" Apr 21 01:51:32.369988 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:51:32.369920 2573 reflector.go:430] 
"Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\"" Apr 21 01:51:32.370167 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:51:32.370112 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-9t7wd\"" Apr 21 01:51:32.374059 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:51:32.374034 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-65f9585684-glq78"] Apr 21 01:51:32.468025 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:51:32.467996 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-56bfd878b7-v67s6"] Apr 21 01:51:32.471061 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:51:32.471045 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-56bfd878b7-v67s6" Apr 21 01:51:32.473231 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:51:32.473209 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 21 01:51:32.473524 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:51:32.473505 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-hdqjq\"" Apr 21 01:51:32.473609 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:51:32.473505 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 21 01:51:32.474141 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:51:32.474120 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 21 01:51:32.475805 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:51:32.475783 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" 
(UniqueName: \"kubernetes.io/secret/70601297-057d-42c8-bef0-315d6797ccfd-metrics-certs\") pod \"router-default-65f9585684-glq78\" (UID: \"70601297-057d-42c8-bef0-315d6797ccfd\") " pod="openshift-ingress/router-default-65f9585684-glq78" Apr 21 01:51:32.475906 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:51:32.475893 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/70601297-057d-42c8-bef0-315d6797ccfd-service-ca-bundle\") pod \"router-default-65f9585684-glq78\" (UID: \"70601297-057d-42c8-bef0-315d6797ccfd\") " pod="openshift-ingress/router-default-65f9585684-glq78" Apr 21 01:51:32.475963 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:51:32.475924 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/70601297-057d-42c8-bef0-315d6797ccfd-stats-auth\") pod \"router-default-65f9585684-glq78\" (UID: \"70601297-057d-42c8-bef0-315d6797ccfd\") " pod="openshift-ingress/router-default-65f9585684-glq78" Apr 21 01:51:32.475963 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:51:32.475954 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/70601297-057d-42c8-bef0-315d6797ccfd-default-certificate\") pod \"router-default-65f9585684-glq78\" (UID: \"70601297-057d-42c8-bef0-315d6797ccfd\") " pod="openshift-ingress/router-default-65f9585684-glq78" Apr 21 01:51:32.476066 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:51:32.476011 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mcmg\" (UniqueName: \"kubernetes.io/projected/70601297-057d-42c8-bef0-315d6797ccfd-kube-api-access-6mcmg\") pod \"router-default-65f9585684-glq78\" (UID: \"70601297-057d-42c8-bef0-315d6797ccfd\") " 
pod="openshift-ingress/router-default-65f9585684-glq78" Apr 21 01:51:32.479381 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:51:32.479356 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 21 01:51:32.485538 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:51:32.485497 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-56bfd878b7-v67s6"] Apr 21 01:51:32.577114 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:51:32.577085 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/70601297-057d-42c8-bef0-315d6797ccfd-service-ca-bundle\") pod \"router-default-65f9585684-glq78\" (UID: \"70601297-057d-42c8-bef0-315d6797ccfd\") " pod="openshift-ingress/router-default-65f9585684-glq78" Apr 21 01:51:32.577114 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:51:32.577115 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/70601297-057d-42c8-bef0-315d6797ccfd-stats-auth\") pod \"router-default-65f9585684-glq78\" (UID: \"70601297-057d-42c8-bef0-315d6797ccfd\") " pod="openshift-ingress/router-default-65f9585684-glq78" Apr 21 01:51:32.577338 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:51:32.577143 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0da3286b-0bd1-4daa-bf87-f363e2dc995d-registry-tls\") pod \"image-registry-56bfd878b7-v67s6\" (UID: \"0da3286b-0bd1-4daa-bf87-f363e2dc995d\") " pod="openshift-image-registry/image-registry-56bfd878b7-v67s6" Apr 21 01:51:32.577338 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:51:32.577172 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: 
\"kubernetes.io/secret/70601297-057d-42c8-bef0-315d6797ccfd-default-certificate\") pod \"router-default-65f9585684-glq78\" (UID: \"70601297-057d-42c8-bef0-315d6797ccfd\") " pod="openshift-ingress/router-default-65f9585684-glq78" Apr 21 01:51:32.577338 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:51:32.577200 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0da3286b-0bd1-4daa-bf87-f363e2dc995d-bound-sa-token\") pod \"image-registry-56bfd878b7-v67s6\" (UID: \"0da3286b-0bd1-4daa-bf87-f363e2dc995d\") " pod="openshift-image-registry/image-registry-56bfd878b7-v67s6" Apr 21 01:51:32.577338 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:51:32.577237 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6mcmg\" (UniqueName: \"kubernetes.io/projected/70601297-057d-42c8-bef0-315d6797ccfd-kube-api-access-6mcmg\") pod \"router-default-65f9585684-glq78\" (UID: \"70601297-057d-42c8-bef0-315d6797ccfd\") " pod="openshift-ingress/router-default-65f9585684-glq78" Apr 21 01:51:32.577338 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:51:32.577262 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0da3286b-0bd1-4daa-bf87-f363e2dc995d-installation-pull-secrets\") pod \"image-registry-56bfd878b7-v67s6\" (UID: \"0da3286b-0bd1-4daa-bf87-f363e2dc995d\") " pod="openshift-image-registry/image-registry-56bfd878b7-v67s6" Apr 21 01:51:32.577338 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:51:32.577308 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/70601297-057d-42c8-bef0-315d6797ccfd-service-ca-bundle podName:70601297-057d-42c8-bef0-315d6797ccfd nodeName:}" failed. No retries permitted until 2026-04-21 01:51:33.077289203 +0000 UTC m=+90.656591251 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/70601297-057d-42c8-bef0-315d6797ccfd-service-ca-bundle") pod "router-default-65f9585684-glq78" (UID: "70601297-057d-42c8-bef0-315d6797ccfd") : configmap references non-existent config key: service-ca.crt Apr 21 01:51:32.577686 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:51:32.577339 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/0da3286b-0bd1-4daa-bf87-f363e2dc995d-image-registry-private-configuration\") pod \"image-registry-56bfd878b7-v67s6\" (UID: \"0da3286b-0bd1-4daa-bf87-f363e2dc995d\") " pod="openshift-image-registry/image-registry-56bfd878b7-v67s6" Apr 21 01:51:32.577686 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:51:32.577374 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0da3286b-0bd1-4daa-bf87-f363e2dc995d-trusted-ca\") pod \"image-registry-56bfd878b7-v67s6\" (UID: \"0da3286b-0bd1-4daa-bf87-f363e2dc995d\") " pod="openshift-image-registry/image-registry-56bfd878b7-v67s6" Apr 21 01:51:32.577686 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:51:32.577415 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2thm\" (UniqueName: \"kubernetes.io/projected/0da3286b-0bd1-4daa-bf87-f363e2dc995d-kube-api-access-h2thm\") pod \"image-registry-56bfd878b7-v67s6\" (UID: \"0da3286b-0bd1-4daa-bf87-f363e2dc995d\") " pod="openshift-image-registry/image-registry-56bfd878b7-v67s6" Apr 21 01:51:32.577686 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:51:32.577442 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/0da3286b-0bd1-4daa-bf87-f363e2dc995d-registry-certificates\") pod \"image-registry-56bfd878b7-v67s6\" (UID: \"0da3286b-0bd1-4daa-bf87-f363e2dc995d\") " pod="openshift-image-registry/image-registry-56bfd878b7-v67s6" Apr 21 01:51:32.577686 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:51:32.577483 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/70601297-057d-42c8-bef0-315d6797ccfd-metrics-certs\") pod \"router-default-65f9585684-glq78\" (UID: \"70601297-057d-42c8-bef0-315d6797ccfd\") " pod="openshift-ingress/router-default-65f9585684-glq78" Apr 21 01:51:32.577686 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:51:32.577519 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0da3286b-0bd1-4daa-bf87-f363e2dc995d-ca-trust-extracted\") pod \"image-registry-56bfd878b7-v67s6\" (UID: \"0da3286b-0bd1-4daa-bf87-f363e2dc995d\") " pod="openshift-image-registry/image-registry-56bfd878b7-v67s6" Apr 21 01:51:32.577686 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:51:32.577592 2573 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 21 01:51:32.577686 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:51:32.577633 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/70601297-057d-42c8-bef0-315d6797ccfd-metrics-certs podName:70601297-057d-42c8-bef0-315d6797ccfd nodeName:}" failed. No retries permitted until 2026-04-21 01:51:33.077620137 +0000 UTC m=+90.656922178 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/70601297-057d-42c8-bef0-315d6797ccfd-metrics-certs") pod "router-default-65f9585684-glq78" (UID: "70601297-057d-42c8-bef0-315d6797ccfd") : secret "router-metrics-certs-default" not found Apr 21 01:51:32.580109 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:51:32.580081 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/70601297-057d-42c8-bef0-315d6797ccfd-stats-auth\") pod \"router-default-65f9585684-glq78\" (UID: \"70601297-057d-42c8-bef0-315d6797ccfd\") " pod="openshift-ingress/router-default-65f9585684-glq78" Apr 21 01:51:32.580217 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:51:32.580132 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/70601297-057d-42c8-bef0-315d6797ccfd-default-certificate\") pod \"router-default-65f9585684-glq78\" (UID: \"70601297-057d-42c8-bef0-315d6797ccfd\") " pod="openshift-ingress/router-default-65f9585684-glq78" Apr 21 01:51:32.585155 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:51:32.585135 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mcmg\" (UniqueName: \"kubernetes.io/projected/70601297-057d-42c8-bef0-315d6797ccfd-kube-api-access-6mcmg\") pod \"router-default-65f9585684-glq78\" (UID: \"70601297-057d-42c8-bef0-315d6797ccfd\") " pod="openshift-ingress/router-default-65f9585684-glq78" Apr 21 01:51:32.678709 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:51:32.678668 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0da3286b-0bd1-4daa-bf87-f363e2dc995d-registry-tls\") pod \"image-registry-56bfd878b7-v67s6\" (UID: \"0da3286b-0bd1-4daa-bf87-f363e2dc995d\") " pod="openshift-image-registry/image-registry-56bfd878b7-v67s6" Apr 21 01:51:32.678949 ip-10-0-129-52 
kubenswrapper[2573]: I0421 01:51:32.678730 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0da3286b-0bd1-4daa-bf87-f363e2dc995d-bound-sa-token\") pod \"image-registry-56bfd878b7-v67s6\" (UID: \"0da3286b-0bd1-4daa-bf87-f363e2dc995d\") " pod="openshift-image-registry/image-registry-56bfd878b7-v67s6" Apr 21 01:51:32.678949 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:51:32.678767 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0da3286b-0bd1-4daa-bf87-f363e2dc995d-installation-pull-secrets\") pod \"image-registry-56bfd878b7-v67s6\" (UID: \"0da3286b-0bd1-4daa-bf87-f363e2dc995d\") " pod="openshift-image-registry/image-registry-56bfd878b7-v67s6" Apr 21 01:51:32.678949 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:51:32.678804 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/0da3286b-0bd1-4daa-bf87-f363e2dc995d-image-registry-private-configuration\") pod \"image-registry-56bfd878b7-v67s6\" (UID: \"0da3286b-0bd1-4daa-bf87-f363e2dc995d\") " pod="openshift-image-registry/image-registry-56bfd878b7-v67s6" Apr 21 01:51:32.678949 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:51:32.678838 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 21 01:51:32.678949 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:51:32.678851 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0da3286b-0bd1-4daa-bf87-f363e2dc995d-trusted-ca\") pod \"image-registry-56bfd878b7-v67s6\" (UID: \"0da3286b-0bd1-4daa-bf87-f363e2dc995d\") " pod="openshift-image-registry/image-registry-56bfd878b7-v67s6" Apr 21 01:51:32.678949 ip-10-0-129-52 kubenswrapper[2573]: E0421 
01:51:32.678865 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-56bfd878b7-v67s6: secret "image-registry-tls" not found Apr 21 01:51:32.678949 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:51:32.678876 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h2thm\" (UniqueName: \"kubernetes.io/projected/0da3286b-0bd1-4daa-bf87-f363e2dc995d-kube-api-access-h2thm\") pod \"image-registry-56bfd878b7-v67s6\" (UID: \"0da3286b-0bd1-4daa-bf87-f363e2dc995d\") " pod="openshift-image-registry/image-registry-56bfd878b7-v67s6" Apr 21 01:51:32.678949 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:51:32.678910 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0da3286b-0bd1-4daa-bf87-f363e2dc995d-registry-certificates\") pod \"image-registry-56bfd878b7-v67s6\" (UID: \"0da3286b-0bd1-4daa-bf87-f363e2dc995d\") " pod="openshift-image-registry/image-registry-56bfd878b7-v67s6" Apr 21 01:51:32.678949 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:51:32.678947 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0da3286b-0bd1-4daa-bf87-f363e2dc995d-registry-tls podName:0da3286b-0bd1-4daa-bf87-f363e2dc995d nodeName:}" failed. No retries permitted until 2026-04-21 01:51:33.178928377 +0000 UTC m=+90.758230418 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/0da3286b-0bd1-4daa-bf87-f363e2dc995d-registry-tls") pod "image-registry-56bfd878b7-v67s6" (UID: "0da3286b-0bd1-4daa-bf87-f363e2dc995d") : secret "image-registry-tls" not found Apr 21 01:51:32.679388 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:51:32.679002 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0da3286b-0bd1-4daa-bf87-f363e2dc995d-ca-trust-extracted\") pod \"image-registry-56bfd878b7-v67s6\" (UID: \"0da3286b-0bd1-4daa-bf87-f363e2dc995d\") " pod="openshift-image-registry/image-registry-56bfd878b7-v67s6" Apr 21 01:51:32.679579 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:51:32.679526 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0da3286b-0bd1-4daa-bf87-f363e2dc995d-ca-trust-extracted\") pod \"image-registry-56bfd878b7-v67s6\" (UID: \"0da3286b-0bd1-4daa-bf87-f363e2dc995d\") " pod="openshift-image-registry/image-registry-56bfd878b7-v67s6" Apr 21 01:51:32.679702 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:51:32.679640 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0da3286b-0bd1-4daa-bf87-f363e2dc995d-registry-certificates\") pod \"image-registry-56bfd878b7-v67s6\" (UID: \"0da3286b-0bd1-4daa-bf87-f363e2dc995d\") " pod="openshift-image-registry/image-registry-56bfd878b7-v67s6" Apr 21 01:51:32.679985 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:51:32.679964 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0da3286b-0bd1-4daa-bf87-f363e2dc995d-trusted-ca\") pod \"image-registry-56bfd878b7-v67s6\" (UID: \"0da3286b-0bd1-4daa-bf87-f363e2dc995d\") " pod="openshift-image-registry/image-registry-56bfd878b7-v67s6" Apr 21 
01:51:32.681253 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:51:32.681234 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/0da3286b-0bd1-4daa-bf87-f363e2dc995d-image-registry-private-configuration\") pod \"image-registry-56bfd878b7-v67s6\" (UID: \"0da3286b-0bd1-4daa-bf87-f363e2dc995d\") " pod="openshift-image-registry/image-registry-56bfd878b7-v67s6" Apr 21 01:51:32.681439 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:51:32.681424 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0da3286b-0bd1-4daa-bf87-f363e2dc995d-installation-pull-secrets\") pod \"image-registry-56bfd878b7-v67s6\" (UID: \"0da3286b-0bd1-4daa-bf87-f363e2dc995d\") " pod="openshift-image-registry/image-registry-56bfd878b7-v67s6" Apr 21 01:51:32.687074 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:51:32.687048 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0da3286b-0bd1-4daa-bf87-f363e2dc995d-bound-sa-token\") pod \"image-registry-56bfd878b7-v67s6\" (UID: \"0da3286b-0bd1-4daa-bf87-f363e2dc995d\") " pod="openshift-image-registry/image-registry-56bfd878b7-v67s6" Apr 21 01:51:32.687419 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:51:32.687404 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2thm\" (UniqueName: \"kubernetes.io/projected/0da3286b-0bd1-4daa-bf87-f363e2dc995d-kube-api-access-h2thm\") pod \"image-registry-56bfd878b7-v67s6\" (UID: \"0da3286b-0bd1-4daa-bf87-f363e2dc995d\") " pod="openshift-image-registry/image-registry-56bfd878b7-v67s6" Apr 21 01:51:33.081647 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:51:33.081557 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/70601297-057d-42c8-bef0-315d6797ccfd-metrics-certs\") pod \"router-default-65f9585684-glq78\" (UID: \"70601297-057d-42c8-bef0-315d6797ccfd\") " pod="openshift-ingress/router-default-65f9585684-glq78" Apr 21 01:51:33.081647 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:51:33.081621 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/70601297-057d-42c8-bef0-315d6797ccfd-service-ca-bundle\") pod \"router-default-65f9585684-glq78\" (UID: \"70601297-057d-42c8-bef0-315d6797ccfd\") " pod="openshift-ingress/router-default-65f9585684-glq78" Apr 21 01:51:33.081904 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:51:33.081698 2573 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 21 01:51:33.081904 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:51:33.081752 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/70601297-057d-42c8-bef0-315d6797ccfd-service-ca-bundle podName:70601297-057d-42c8-bef0-315d6797ccfd nodeName:}" failed. No retries permitted until 2026-04-21 01:51:34.0817357 +0000 UTC m=+91.661037739 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/70601297-057d-42c8-bef0-315d6797ccfd-service-ca-bundle") pod "router-default-65f9585684-glq78" (UID: "70601297-057d-42c8-bef0-315d6797ccfd") : configmap references non-existent config key: service-ca.crt Apr 21 01:51:33.081904 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:51:33.081769 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/70601297-057d-42c8-bef0-315d6797ccfd-metrics-certs podName:70601297-057d-42c8-bef0-315d6797ccfd nodeName:}" failed. No retries permitted until 2026-04-21 01:51:34.081762175 +0000 UTC m=+91.661064212 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/70601297-057d-42c8-bef0-315d6797ccfd-metrics-certs") pod "router-default-65f9585684-glq78" (UID: "70601297-057d-42c8-bef0-315d6797ccfd") : secret "router-metrics-certs-default" not found Apr 21 01:51:33.182227 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:51:33.182170 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0da3286b-0bd1-4daa-bf87-f363e2dc995d-registry-tls\") pod \"image-registry-56bfd878b7-v67s6\" (UID: \"0da3286b-0bd1-4daa-bf87-f363e2dc995d\") " pod="openshift-image-registry/image-registry-56bfd878b7-v67s6" Apr 21 01:51:33.182405 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:51:33.182332 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 21 01:51:33.182405 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:51:33.182350 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-56bfd878b7-v67s6: secret "image-registry-tls" not found Apr 21 01:51:33.182405 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:51:33.182403 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0da3286b-0bd1-4daa-bf87-f363e2dc995d-registry-tls podName:0da3286b-0bd1-4daa-bf87-f363e2dc995d nodeName:}" failed. No retries permitted until 2026-04-21 01:51:34.182386515 +0000 UTC m=+91.761688568 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/0da3286b-0bd1-4daa-bf87-f363e2dc995d-registry-tls") pod "image-registry-56bfd878b7-v67s6" (UID: "0da3286b-0bd1-4daa-bf87-f363e2dc995d") : secret "image-registry-tls" not found Apr 21 01:51:34.089069 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:51:34.089013 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/70601297-057d-42c8-bef0-315d6797ccfd-service-ca-bundle\") pod \"router-default-65f9585684-glq78\" (UID: \"70601297-057d-42c8-bef0-315d6797ccfd\") " pod="openshift-ingress/router-default-65f9585684-glq78" Apr 21 01:51:34.089480 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:51:34.089180 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/70601297-057d-42c8-bef0-315d6797ccfd-service-ca-bundle podName:70601297-057d-42c8-bef0-315d6797ccfd nodeName:}" failed. No retries permitted until 2026-04-21 01:51:36.089162123 +0000 UTC m=+93.668464159 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/70601297-057d-42c8-bef0-315d6797ccfd-service-ca-bundle") pod "router-default-65f9585684-glq78" (UID: "70601297-057d-42c8-bef0-315d6797ccfd") : configmap references non-existent config key: service-ca.crt Apr 21 01:51:34.089480 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:51:34.089227 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/70601297-057d-42c8-bef0-315d6797ccfd-metrics-certs\") pod \"router-default-65f9585684-glq78\" (UID: \"70601297-057d-42c8-bef0-315d6797ccfd\") " pod="openshift-ingress/router-default-65f9585684-glq78" Apr 21 01:51:34.089480 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:51:34.089309 2573 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 21 01:51:34.089480 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:51:34.089339 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/70601297-057d-42c8-bef0-315d6797ccfd-metrics-certs podName:70601297-057d-42c8-bef0-315d6797ccfd nodeName:}" failed. No retries permitted until 2026-04-21 01:51:36.089331625 +0000 UTC m=+93.668633661 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/70601297-057d-42c8-bef0-315d6797ccfd-metrics-certs") pod "router-default-65f9585684-glq78" (UID: "70601297-057d-42c8-bef0-315d6797ccfd") : secret "router-metrics-certs-default" not found Apr 21 01:51:34.190503 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:51:34.190465 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0da3286b-0bd1-4daa-bf87-f363e2dc995d-registry-tls\") pod \"image-registry-56bfd878b7-v67s6\" (UID: \"0da3286b-0bd1-4daa-bf87-f363e2dc995d\") " pod="openshift-image-registry/image-registry-56bfd878b7-v67s6" Apr 21 01:51:34.190668 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:51:34.190595 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 21 01:51:34.190668 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:51:34.190607 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-56bfd878b7-v67s6: secret "image-registry-tls" not found Apr 21 01:51:34.190668 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:51:34.190654 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0da3286b-0bd1-4daa-bf87-f363e2dc995d-registry-tls podName:0da3286b-0bd1-4daa-bf87-f363e2dc995d nodeName:}" failed. No retries permitted until 2026-04-21 01:51:36.190641229 +0000 UTC m=+93.769943266 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/0da3286b-0bd1-4daa-bf87-f363e2dc995d-registry-tls") pod "image-registry-56bfd878b7-v67s6" (UID: "0da3286b-0bd1-4daa-bf87-f363e2dc995d") : secret "image-registry-tls" not found Apr 21 01:51:36.102634 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:51:36.102580 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/70601297-057d-42c8-bef0-315d6797ccfd-metrics-certs\") pod \"router-default-65f9585684-glq78\" (UID: \"70601297-057d-42c8-bef0-315d6797ccfd\") " pod="openshift-ingress/router-default-65f9585684-glq78" Apr 21 01:51:36.102634 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:51:36.102652 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/70601297-057d-42c8-bef0-315d6797ccfd-service-ca-bundle\") pod \"router-default-65f9585684-glq78\" (UID: \"70601297-057d-42c8-bef0-315d6797ccfd\") " pod="openshift-ingress/router-default-65f9585684-glq78" Apr 21 01:51:36.103170 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:51:36.102756 2573 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 21 01:51:36.103170 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:51:36.102766 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/70601297-057d-42c8-bef0-315d6797ccfd-service-ca-bundle podName:70601297-057d-42c8-bef0-315d6797ccfd nodeName:}" failed. No retries permitted until 2026-04-21 01:51:40.102750518 +0000 UTC m=+97.682052558 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/70601297-057d-42c8-bef0-315d6797ccfd-service-ca-bundle") pod "router-default-65f9585684-glq78" (UID: "70601297-057d-42c8-bef0-315d6797ccfd") : configmap references non-existent config key: service-ca.crt Apr 21 01:51:36.103170 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:51:36.102861 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/70601297-057d-42c8-bef0-315d6797ccfd-metrics-certs podName:70601297-057d-42c8-bef0-315d6797ccfd nodeName:}" failed. No retries permitted until 2026-04-21 01:51:40.102842996 +0000 UTC m=+97.682145031 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/70601297-057d-42c8-bef0-315d6797ccfd-metrics-certs") pod "router-default-65f9585684-glq78" (UID: "70601297-057d-42c8-bef0-315d6797ccfd") : secret "router-metrics-certs-default" not found Apr 21 01:51:36.203121 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:51:36.203078 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0da3286b-0bd1-4daa-bf87-f363e2dc995d-registry-tls\") pod \"image-registry-56bfd878b7-v67s6\" (UID: \"0da3286b-0bd1-4daa-bf87-f363e2dc995d\") " pod="openshift-image-registry/image-registry-56bfd878b7-v67s6" Apr 21 01:51:36.203274 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:51:36.203185 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 21 01:51:36.203274 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:51:36.203206 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-56bfd878b7-v67s6: secret "image-registry-tls" not found Apr 21 01:51:36.203274 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:51:36.203256 2573 nestedpendingoperations.go:348] Operation 
for "{volumeName:kubernetes.io/projected/0da3286b-0bd1-4daa-bf87-f363e2dc995d-registry-tls podName:0da3286b-0bd1-4daa-bf87-f363e2dc995d nodeName:}" failed. No retries permitted until 2026-04-21 01:51:40.203241828 +0000 UTC m=+97.782543863 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/0da3286b-0bd1-4daa-bf87-f363e2dc995d-registry-tls") pod "image-registry-56bfd878b7-v67s6" (UID: "0da3286b-0bd1-4daa-bf87-f363e2dc995d") : secret "image-registry-tls" not found Apr 21 01:51:39.126371 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:51:39.126332 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ff1d28c8-cbb1-4385-9c36-da62f691590f-cert\") pod \"ingress-canary-r9t8h\" (UID: \"ff1d28c8-cbb1-4385-9c36-da62f691590f\") " pod="openshift-ingress-canary/ingress-canary-r9t8h" Apr 21 01:51:39.126877 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:51:39.126379 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8598fef8-2cf7-4d82-aa02-44eac46217af-metrics-tls\") pod \"dns-default-w2qdq\" (UID: \"8598fef8-2cf7-4d82-aa02-44eac46217af\") " pod="openshift-dns/dns-default-w2qdq" Apr 21 01:51:39.126877 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:51:39.126481 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 01:51:39.126877 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:51:39.126497 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 01:51:39.126877 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:51:39.126542 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8598fef8-2cf7-4d82-aa02-44eac46217af-metrics-tls podName:8598fef8-2cf7-4d82-aa02-44eac46217af nodeName:}" failed. 
No retries permitted until 2026-04-21 01:52:43.126527655 +0000 UTC m=+160.705829690 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8598fef8-2cf7-4d82-aa02-44eac46217af-metrics-tls") pod "dns-default-w2qdq" (UID: "8598fef8-2cf7-4d82-aa02-44eac46217af") : secret "dns-default-metrics-tls" not found Apr 21 01:51:39.126877 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:51:39.126579 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ff1d28c8-cbb1-4385-9c36-da62f691590f-cert podName:ff1d28c8-cbb1-4385-9c36-da62f691590f nodeName:}" failed. No retries permitted until 2026-04-21 01:52:43.126559028 +0000 UTC m=+160.705861084 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ff1d28c8-cbb1-4385-9c36-da62f691590f-cert") pod "ingress-canary-r9t8h" (UID: "ff1d28c8-cbb1-4385-9c36-da62f691590f") : secret "canary-serving-cert" not found Apr 21 01:51:40.134020 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:51:40.133977 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/70601297-057d-42c8-bef0-315d6797ccfd-service-ca-bundle\") pod \"router-default-65f9585684-glq78\" (UID: \"70601297-057d-42c8-bef0-315d6797ccfd\") " pod="openshift-ingress/router-default-65f9585684-glq78" Apr 21 01:51:40.134483 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:51:40.134081 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/70601297-057d-42c8-bef0-315d6797ccfd-metrics-certs\") pod \"router-default-65f9585684-glq78\" (UID: \"70601297-057d-42c8-bef0-315d6797ccfd\") " pod="openshift-ingress/router-default-65f9585684-glq78" Apr 21 01:51:40.134483 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:51:40.134162 2573 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/70601297-057d-42c8-bef0-315d6797ccfd-service-ca-bundle podName:70601297-057d-42c8-bef0-315d6797ccfd nodeName:}" failed. No retries permitted until 2026-04-21 01:51:48.13414386 +0000 UTC m=+105.713445895 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/70601297-057d-42c8-bef0-315d6797ccfd-service-ca-bundle") pod "router-default-65f9585684-glq78" (UID: "70601297-057d-42c8-bef0-315d6797ccfd") : configmap references non-existent config key: service-ca.crt Apr 21 01:51:40.134483 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:51:40.134213 2573 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 21 01:51:40.134483 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:51:40.134280 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/70601297-057d-42c8-bef0-315d6797ccfd-metrics-certs podName:70601297-057d-42c8-bef0-315d6797ccfd nodeName:}" failed. No retries permitted until 2026-04-21 01:51:48.134261636 +0000 UTC m=+105.713563690 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/70601297-057d-42c8-bef0-315d6797ccfd-metrics-certs") pod "router-default-65f9585684-glq78" (UID: "70601297-057d-42c8-bef0-315d6797ccfd") : secret "router-metrics-certs-default" not found Apr 21 01:51:40.234563 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:51:40.234522 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0da3286b-0bd1-4daa-bf87-f363e2dc995d-registry-tls\") pod \"image-registry-56bfd878b7-v67s6\" (UID: \"0da3286b-0bd1-4daa-bf87-f363e2dc995d\") " pod="openshift-image-registry/image-registry-56bfd878b7-v67s6" Apr 21 01:51:40.234733 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:51:40.234676 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 21 01:51:40.234733 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:51:40.234700 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-56bfd878b7-v67s6: secret "image-registry-tls" not found Apr 21 01:51:40.234853 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:51:40.234758 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0da3286b-0bd1-4daa-bf87-f363e2dc995d-registry-tls podName:0da3286b-0bd1-4daa-bf87-f363e2dc995d nodeName:}" failed. No retries permitted until 2026-04-21 01:51:48.234740842 +0000 UTC m=+105.814042877 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/0da3286b-0bd1-4daa-bf87-f363e2dc995d-registry-tls") pod "image-registry-56bfd878b7-v67s6" (UID: "0da3286b-0bd1-4daa-bf87-f363e2dc995d") : secret "image-registry-tls" not found Apr 21 01:51:41.298511 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:51:41.298471 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-bf7vv"] Apr 21 01:51:41.304027 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:51:41.304007 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-bf7vv"] Apr 21 01:51:41.304151 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:51:41.304102 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-bf7vv" Apr 21 01:51:41.306562 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:51:41.306541 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 21 01:51:41.306701 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:51:41.306539 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 21 01:51:41.306701 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:51:41.306540 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 21 01:51:41.306701 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:51:41.306577 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 21 01:51:41.307636 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:51:41.307618 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-9xgtn\"" Apr 21 01:51:41.342528 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:51:41.342490 
2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/58f34897-6aad-4803-b675-267fd38e9030-signing-key\") pod \"service-ca-865cb79987-bf7vv\" (UID: \"58f34897-6aad-4803-b675-267fd38e9030\") " pod="openshift-service-ca/service-ca-865cb79987-bf7vv" Apr 21 01:51:41.342684 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:51:41.342570 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/58f34897-6aad-4803-b675-267fd38e9030-signing-cabundle\") pod \"service-ca-865cb79987-bf7vv\" (UID: \"58f34897-6aad-4803-b675-267fd38e9030\") " pod="openshift-service-ca/service-ca-865cb79987-bf7vv" Apr 21 01:51:41.342684 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:51:41.342598 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wpdb2\" (UniqueName: \"kubernetes.io/projected/58f34897-6aad-4803-b675-267fd38e9030-kube-api-access-wpdb2\") pod \"service-ca-865cb79987-bf7vv\" (UID: \"58f34897-6aad-4803-b675-267fd38e9030\") " pod="openshift-service-ca/service-ca-865cb79987-bf7vv" Apr 21 01:51:41.443509 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:51:41.443475 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/58f34897-6aad-4803-b675-267fd38e9030-signing-cabundle\") pod \"service-ca-865cb79987-bf7vv\" (UID: \"58f34897-6aad-4803-b675-267fd38e9030\") " pod="openshift-service-ca/service-ca-865cb79987-bf7vv" Apr 21 01:51:41.443509 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:51:41.443516 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wpdb2\" (UniqueName: \"kubernetes.io/projected/58f34897-6aad-4803-b675-267fd38e9030-kube-api-access-wpdb2\") pod \"service-ca-865cb79987-bf7vv\" (UID: 
\"58f34897-6aad-4803-b675-267fd38e9030\") " pod="openshift-service-ca/service-ca-865cb79987-bf7vv" Apr 21 01:51:41.443727 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:51:41.443590 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/58f34897-6aad-4803-b675-267fd38e9030-signing-key\") pod \"service-ca-865cb79987-bf7vv\" (UID: \"58f34897-6aad-4803-b675-267fd38e9030\") " pod="openshift-service-ca/service-ca-865cb79987-bf7vv" Apr 21 01:51:41.444165 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:51:41.444143 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/58f34897-6aad-4803-b675-267fd38e9030-signing-cabundle\") pod \"service-ca-865cb79987-bf7vv\" (UID: \"58f34897-6aad-4803-b675-267fd38e9030\") " pod="openshift-service-ca/service-ca-865cb79987-bf7vv" Apr 21 01:51:41.446047 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:51:41.446028 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/58f34897-6aad-4803-b675-267fd38e9030-signing-key\") pod \"service-ca-865cb79987-bf7vv\" (UID: \"58f34897-6aad-4803-b675-267fd38e9030\") " pod="openshift-service-ca/service-ca-865cb79987-bf7vv" Apr 21 01:51:41.451278 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:51:41.451251 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wpdb2\" (UniqueName: \"kubernetes.io/projected/58f34897-6aad-4803-b675-267fd38e9030-kube-api-access-wpdb2\") pod \"service-ca-865cb79987-bf7vv\" (UID: \"58f34897-6aad-4803-b675-267fd38e9030\") " pod="openshift-service-ca/service-ca-865cb79987-bf7vv" Apr 21 01:51:41.612986 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:51:41.612900 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-bf7vv" Apr 21 01:51:41.731770 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:51:41.731741 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-bf7vv"] Apr 21 01:51:41.735340 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:51:41.735307 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod58f34897_6aad_4803_b675_267fd38e9030.slice/crio-88368964a3cc3090989db3c08e069c3ef49ffffc1c880db4d1b1b1aa3cbaea2b WatchSource:0}: Error finding container 88368964a3cc3090989db3c08e069c3ef49ffffc1c880db4d1b1b1aa3cbaea2b: Status 404 returned error can't find the container with id 88368964a3cc3090989db3c08e069c3ef49ffffc1c880db4d1b1b1aa3cbaea2b Apr 21 01:51:41.775493 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:51:41.775470 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-w4c4r_cc11073d-4e8d-4b5b-b1bd-40ad61aa03b0/dns-node-resolver/0.log" Apr 21 01:51:42.384962 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:51:42.384927 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-bvqj7" Apr 21 01:51:42.440266 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:51:42.440216 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-bf7vv" event={"ID":"58f34897-6aad-4803-b675-267fd38e9030","Type":"ContainerStarted","Data":"88368964a3cc3090989db3c08e069c3ef49ffffc1c880db4d1b1b1aa3cbaea2b"} Apr 21 01:51:42.775391 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:51:42.775317 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-xfl2x_6de78603-c646-45a5-8bc4-9cfc56456d0f/node-ca/0.log" Apr 21 01:51:44.444521 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:51:44.444481 2573 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="openshift-service-ca/service-ca-865cb79987-bf7vv" event={"ID":"58f34897-6aad-4803-b675-267fd38e9030","Type":"ContainerStarted","Data":"cdb0904708816e2a10d247e34303b446688242323bd43a4f819a9e56f004c06e"} Apr 21 01:51:44.461583 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:51:44.461538 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-bf7vv" podStartSLOduration=1.748675677 podStartE2EDuration="3.461525331s" podCreationTimestamp="2026-04-21 01:51:41 +0000 UTC" firstStartedPulling="2026-04-21 01:51:41.737126144 +0000 UTC m=+99.316428180" lastFinishedPulling="2026-04-21 01:51:43.449975798 +0000 UTC m=+101.029277834" observedRunningTime="2026-04-21 01:51:44.460325406 +0000 UTC m=+102.039627464" watchObservedRunningTime="2026-04-21 01:51:44.461525331 +0000 UTC m=+102.040827388" Apr 21 01:51:48.197480 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:51:48.197443 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/70601297-057d-42c8-bef0-315d6797ccfd-service-ca-bundle\") pod \"router-default-65f9585684-glq78\" (UID: \"70601297-057d-42c8-bef0-315d6797ccfd\") " pod="openshift-ingress/router-default-65f9585684-glq78" Apr 21 01:51:48.197885 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:51:48.197595 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/70601297-057d-42c8-bef0-315d6797ccfd-metrics-certs\") pod \"router-default-65f9585684-glq78\" (UID: \"70601297-057d-42c8-bef0-315d6797ccfd\") " pod="openshift-ingress/router-default-65f9585684-glq78" Apr 21 01:51:48.197885 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:51:48.197613 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/70601297-057d-42c8-bef0-315d6797ccfd-service-ca-bundle podName:70601297-057d-42c8-bef0-315d6797ccfd nodeName:}" failed. 
No retries permitted until 2026-04-21 01:52:04.197591162 +0000 UTC m=+121.776893202 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/70601297-057d-42c8-bef0-315d6797ccfd-service-ca-bundle") pod "router-default-65f9585684-glq78" (UID: "70601297-057d-42c8-bef0-315d6797ccfd") : configmap references non-existent config key: service-ca.crt Apr 21 01:51:48.197885 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:51:48.197662 2573 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 21 01:51:48.197885 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:51:48.197698 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/70601297-057d-42c8-bef0-315d6797ccfd-metrics-certs podName:70601297-057d-42c8-bef0-315d6797ccfd nodeName:}" failed. No retries permitted until 2026-04-21 01:52:04.197688399 +0000 UTC m=+121.776990434 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/70601297-057d-42c8-bef0-315d6797ccfd-metrics-certs") pod "router-default-65f9585684-glq78" (UID: "70601297-057d-42c8-bef0-315d6797ccfd") : secret "router-metrics-certs-default" not found Apr 21 01:51:48.298617 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:51:48.298580 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0da3286b-0bd1-4daa-bf87-f363e2dc995d-registry-tls\") pod \"image-registry-56bfd878b7-v67s6\" (UID: \"0da3286b-0bd1-4daa-bf87-f363e2dc995d\") " pod="openshift-image-registry/image-registry-56bfd878b7-v67s6" Apr 21 01:51:48.298793 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:51:48.298729 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 21 01:51:48.298793 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:51:48.298746 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-56bfd878b7-v67s6: secret "image-registry-tls" not found Apr 21 01:51:48.298908 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:51:48.298801 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0da3286b-0bd1-4daa-bf87-f363e2dc995d-registry-tls podName:0da3286b-0bd1-4daa-bf87-f363e2dc995d nodeName:}" failed. No retries permitted until 2026-04-21 01:52:04.298784749 +0000 UTC m=+121.878086791 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/0da3286b-0bd1-4daa-bf87-f363e2dc995d-registry-tls") pod "image-registry-56bfd878b7-v67s6" (UID: "0da3286b-0bd1-4daa-bf87-f363e2dc995d") : secret "image-registry-tls" not found Apr 21 01:52:00.615224 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:00.615187 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-6ghj7"] Apr 21 01:52:00.618483 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:00.618456 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-6ghj7" Apr 21 01:52:00.621512 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:00.621489 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-q47kz\"" Apr 21 01:52:00.621512 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:00.621504 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 21 01:52:00.622565 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:00.622544 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 21 01:52:00.622565 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:00.622559 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 21 01:52:00.622748 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:00.622544 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 21 01:52:00.629200 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:00.629176 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-6ghj7"] Apr 21 01:52:00.677487 ip-10-0-129-52 
kubenswrapper[2573]: I0421 01:52:00.677443 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-56bfd878b7-v67s6"] Apr 21 01:52:00.678839 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:52:00.677834 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-image-registry/image-registry-56bfd878b7-v67s6" podUID="0da3286b-0bd1-4daa-bf87-f363e2dc995d" Apr 21 01:52:00.695683 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:00.695644 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/2771f995-2e2b-48cb-a750-37d375badbcd-crio-socket\") pod \"insights-runtime-extractor-6ghj7\" (UID: \"2771f995-2e2b-48cb-a750-37d375badbcd\") " pod="openshift-insights/insights-runtime-extractor-6ghj7" Apr 21 01:52:00.695862 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:00.695710 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vztdh\" (UniqueName: \"kubernetes.io/projected/2771f995-2e2b-48cb-a750-37d375badbcd-kube-api-access-vztdh\") pod \"insights-runtime-extractor-6ghj7\" (UID: \"2771f995-2e2b-48cb-a750-37d375badbcd\") " pod="openshift-insights/insights-runtime-extractor-6ghj7" Apr 21 01:52:00.695862 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:00.695730 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/2771f995-2e2b-48cb-a750-37d375badbcd-data-volume\") pod \"insights-runtime-extractor-6ghj7\" (UID: \"2771f995-2e2b-48cb-a750-37d375badbcd\") " pod="openshift-insights/insights-runtime-extractor-6ghj7" Apr 21 01:52:00.695862 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:00.695748 2573 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/2771f995-2e2b-48cb-a750-37d375badbcd-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-6ghj7\" (UID: \"2771f995-2e2b-48cb-a750-37d375badbcd\") " pod="openshift-insights/insights-runtime-extractor-6ghj7" Apr 21 01:52:00.695862 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:00.695766 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/2771f995-2e2b-48cb-a750-37d375badbcd-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-6ghj7\" (UID: \"2771f995-2e2b-48cb-a750-37d375badbcd\") " pod="openshift-insights/insights-runtime-extractor-6ghj7" Apr 21 01:52:00.796866 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:00.796804 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/2771f995-2e2b-48cb-a750-37d375badbcd-crio-socket\") pod \"insights-runtime-extractor-6ghj7\" (UID: \"2771f995-2e2b-48cb-a750-37d375badbcd\") " pod="openshift-insights/insights-runtime-extractor-6ghj7" Apr 21 01:52:00.797050 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:00.796890 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vztdh\" (UniqueName: \"kubernetes.io/projected/2771f995-2e2b-48cb-a750-37d375badbcd-kube-api-access-vztdh\") pod \"insights-runtime-extractor-6ghj7\" (UID: \"2771f995-2e2b-48cb-a750-37d375badbcd\") " pod="openshift-insights/insights-runtime-extractor-6ghj7" Apr 21 01:52:00.797050 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:00.796911 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/2771f995-2e2b-48cb-a750-37d375badbcd-data-volume\") pod \"insights-runtime-extractor-6ghj7\" (UID: 
\"2771f995-2e2b-48cb-a750-37d375badbcd\") " pod="openshift-insights/insights-runtime-extractor-6ghj7" Apr 21 01:52:00.797050 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:00.796931 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/2771f995-2e2b-48cb-a750-37d375badbcd-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-6ghj7\" (UID: \"2771f995-2e2b-48cb-a750-37d375badbcd\") " pod="openshift-insights/insights-runtime-extractor-6ghj7" Apr 21 01:52:00.797050 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:00.796932 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/2771f995-2e2b-48cb-a750-37d375badbcd-crio-socket\") pod \"insights-runtime-extractor-6ghj7\" (UID: \"2771f995-2e2b-48cb-a750-37d375badbcd\") " pod="openshift-insights/insights-runtime-extractor-6ghj7" Apr 21 01:52:00.797050 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:00.796947 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/2771f995-2e2b-48cb-a750-37d375badbcd-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-6ghj7\" (UID: \"2771f995-2e2b-48cb-a750-37d375badbcd\") " pod="openshift-insights/insights-runtime-extractor-6ghj7" Apr 21 01:52:00.797350 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:00.797330 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/2771f995-2e2b-48cb-a750-37d375badbcd-data-volume\") pod \"insights-runtime-extractor-6ghj7\" (UID: \"2771f995-2e2b-48cb-a750-37d375badbcd\") " pod="openshift-insights/insights-runtime-extractor-6ghj7" Apr 21 01:52:00.797446 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:00.797432 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: 
\"kubernetes.io/configmap/2771f995-2e2b-48cb-a750-37d375badbcd-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-6ghj7\" (UID: \"2771f995-2e2b-48cb-a750-37d375badbcd\") " pod="openshift-insights/insights-runtime-extractor-6ghj7" Apr 21 01:52:00.799268 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:00.799245 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/2771f995-2e2b-48cb-a750-37d375badbcd-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-6ghj7\" (UID: \"2771f995-2e2b-48cb-a750-37d375badbcd\") " pod="openshift-insights/insights-runtime-extractor-6ghj7" Apr 21 01:52:00.804879 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:00.804855 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vztdh\" (UniqueName: \"kubernetes.io/projected/2771f995-2e2b-48cb-a750-37d375badbcd-kube-api-access-vztdh\") pod \"insights-runtime-extractor-6ghj7\" (UID: \"2771f995-2e2b-48cb-a750-37d375badbcd\") " pod="openshift-insights/insights-runtime-extractor-6ghj7" Apr 21 01:52:00.928986 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:00.928953 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-6ghj7" Apr 21 01:52:01.049117 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:01.049081 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-6ghj7"] Apr 21 01:52:01.053088 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:52:01.053061 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2771f995_2e2b_48cb_a750_37d375badbcd.slice/crio-b005dd0404be28e5079de04676c63198baef1f332e6e4d1a4e211f1f396ddb97 WatchSource:0}: Error finding container b005dd0404be28e5079de04676c63198baef1f332e6e4d1a4e211f1f396ddb97: Status 404 returned error can't find the container with id b005dd0404be28e5079de04676c63198baef1f332e6e4d1a4e211f1f396ddb97 Apr 21 01:52:01.477213 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:01.477179 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-6ghj7" event={"ID":"2771f995-2e2b-48cb-a750-37d375badbcd","Type":"ContainerStarted","Data":"41cae6c0cf022d6a573937bcc6d1cd6aa561e5e3569e1165b426c08bd9d4e7b1"} Apr 21 01:52:01.477213 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:01.477207 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-56bfd878b7-v67s6" Apr 21 01:52:01.477213 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:01.477218 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-6ghj7" event={"ID":"2771f995-2e2b-48cb-a750-37d375badbcd","Type":"ContainerStarted","Data":"b005dd0404be28e5079de04676c63198baef1f332e6e4d1a4e211f1f396ddb97"} Apr 21 01:52:01.481026 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:01.481008 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-56bfd878b7-v67s6" Apr 21 01:52:01.604995 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:01.604956 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0da3286b-0bd1-4daa-bf87-f363e2dc995d-installation-pull-secrets\") pod \"0da3286b-0bd1-4daa-bf87-f363e2dc995d\" (UID: \"0da3286b-0bd1-4daa-bf87-f363e2dc995d\") " Apr 21 01:52:01.605173 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:01.605011 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0da3286b-0bd1-4daa-bf87-f363e2dc995d-bound-sa-token\") pod \"0da3286b-0bd1-4daa-bf87-f363e2dc995d\" (UID: \"0da3286b-0bd1-4daa-bf87-f363e2dc995d\") " Apr 21 01:52:01.605173 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:01.605040 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/0da3286b-0bd1-4daa-bf87-f363e2dc995d-image-registry-private-configuration\") pod \"0da3286b-0bd1-4daa-bf87-f363e2dc995d\" (UID: \"0da3286b-0bd1-4daa-bf87-f363e2dc995d\") " Apr 21 01:52:01.605173 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:01.605066 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0da3286b-0bd1-4daa-bf87-f363e2dc995d-registry-certificates\") pod \"0da3286b-0bd1-4daa-bf87-f363e2dc995d\" (UID: \"0da3286b-0bd1-4daa-bf87-f363e2dc995d\") " Apr 21 01:52:01.605173 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:01.605148 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0da3286b-0bd1-4daa-bf87-f363e2dc995d-trusted-ca\") pod \"0da3286b-0bd1-4daa-bf87-f363e2dc995d\" (UID: 
\"0da3286b-0bd1-4daa-bf87-f363e2dc995d\") " Apr 21 01:52:01.605392 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:01.605182 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h2thm\" (UniqueName: \"kubernetes.io/projected/0da3286b-0bd1-4daa-bf87-f363e2dc995d-kube-api-access-h2thm\") pod \"0da3286b-0bd1-4daa-bf87-f363e2dc995d\" (UID: \"0da3286b-0bd1-4daa-bf87-f363e2dc995d\") " Apr 21 01:52:01.605392 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:01.605224 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0da3286b-0bd1-4daa-bf87-f363e2dc995d-ca-trust-extracted\") pod \"0da3286b-0bd1-4daa-bf87-f363e2dc995d\" (UID: \"0da3286b-0bd1-4daa-bf87-f363e2dc995d\") " Apr 21 01:52:01.605677 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:01.605651 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0da3286b-0bd1-4daa-bf87-f363e2dc995d-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "0da3286b-0bd1-4daa-bf87-f363e2dc995d" (UID: "0da3286b-0bd1-4daa-bf87-f363e2dc995d"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 01:52:01.605800 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:01.605734 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0da3286b-0bd1-4daa-bf87-f363e2dc995d-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "0da3286b-0bd1-4daa-bf87-f363e2dc995d" (UID: "0da3286b-0bd1-4daa-bf87-f363e2dc995d"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 01:52:01.606019 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:01.605998 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0da3286b-0bd1-4daa-bf87-f363e2dc995d-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "0da3286b-0bd1-4daa-bf87-f363e2dc995d" (UID: "0da3286b-0bd1-4daa-bf87-f363e2dc995d"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 01:52:01.607468 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:01.607444 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0da3286b-0bd1-4daa-bf87-f363e2dc995d-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "0da3286b-0bd1-4daa-bf87-f363e2dc995d" (UID: "0da3286b-0bd1-4daa-bf87-f363e2dc995d"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 01:52:01.607577 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:01.607552 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0da3286b-0bd1-4daa-bf87-f363e2dc995d-kube-api-access-h2thm" (OuterVolumeSpecName: "kube-api-access-h2thm") pod "0da3286b-0bd1-4daa-bf87-f363e2dc995d" (UID: "0da3286b-0bd1-4daa-bf87-f363e2dc995d"). InnerVolumeSpecName "kube-api-access-h2thm". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 01:52:01.607640 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:01.607576 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0da3286b-0bd1-4daa-bf87-f363e2dc995d-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "0da3286b-0bd1-4daa-bf87-f363e2dc995d" (UID: "0da3286b-0bd1-4daa-bf87-f363e2dc995d"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 01:52:01.607682 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:01.607630 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0da3286b-0bd1-4daa-bf87-f363e2dc995d-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "0da3286b-0bd1-4daa-bf87-f363e2dc995d" (UID: "0da3286b-0bd1-4daa-bf87-f363e2dc995d"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 01:52:01.706590 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:01.706564 2573 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0da3286b-0bd1-4daa-bf87-f363e2dc995d-trusted-ca\") on node \"ip-10-0-129-52.ec2.internal\" DevicePath \"\"" Apr 21 01:52:01.706590 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:01.706588 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-h2thm\" (UniqueName: \"kubernetes.io/projected/0da3286b-0bd1-4daa-bf87-f363e2dc995d-kube-api-access-h2thm\") on node \"ip-10-0-129-52.ec2.internal\" DevicePath \"\"" Apr 21 01:52:01.707037 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:01.706597 2573 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0da3286b-0bd1-4daa-bf87-f363e2dc995d-ca-trust-extracted\") on node \"ip-10-0-129-52.ec2.internal\" DevicePath \"\"" Apr 21 01:52:01.707037 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:01.706606 2573 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0da3286b-0bd1-4daa-bf87-f363e2dc995d-installation-pull-secrets\") on node \"ip-10-0-129-52.ec2.internal\" DevicePath \"\"" Apr 21 01:52:01.707037 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:01.706615 2573 reconciler_common.go:299] "Volume detached for volume 
\"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0da3286b-0bd1-4daa-bf87-f363e2dc995d-bound-sa-token\") on node \"ip-10-0-129-52.ec2.internal\" DevicePath \"\"" Apr 21 01:52:01.707037 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:01.706624 2573 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/0da3286b-0bd1-4daa-bf87-f363e2dc995d-image-registry-private-configuration\") on node \"ip-10-0-129-52.ec2.internal\" DevicePath \"\"" Apr 21 01:52:01.707037 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:01.706633 2573 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0da3286b-0bd1-4daa-bf87-f363e2dc995d-registry-certificates\") on node \"ip-10-0-129-52.ec2.internal\" DevicePath \"\"" Apr 21 01:52:02.481559 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:02.481519 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-6ghj7" event={"ID":"2771f995-2e2b-48cb-a750-37d375badbcd","Type":"ContainerStarted","Data":"278426ac1ace1a69f30e765a517915a46676fb6fcc5d56826470dd865aba5067"} Apr 21 01:52:02.481559 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:02.481550 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-56bfd878b7-v67s6" Apr 21 01:52:02.514066 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:02.514023 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-56bfd878b7-v67s6"] Apr 21 01:52:02.519286 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:02.519259 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-56bfd878b7-v67s6"] Apr 21 01:52:02.613829 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:02.613784 2573 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0da3286b-0bd1-4daa-bf87-f363e2dc995d-registry-tls\") on node \"ip-10-0-129-52.ec2.internal\" DevicePath \"\"" Apr 21 01:52:03.062261 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:03.062230 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0da3286b-0bd1-4daa-bf87-f363e2dc995d" path="/var/lib/kubelet/pods/0da3286b-0bd1-4daa-bf87-f363e2dc995d/volumes" Apr 21 01:52:03.485405 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:03.485365 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-6ghj7" event={"ID":"2771f995-2e2b-48cb-a750-37d375badbcd","Type":"ContainerStarted","Data":"5a59aeabf3a5d8e93028005579d959aec571c1581e1a9de309ebc85cad80010a"} Apr 21 01:52:03.502342 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:03.502284 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-6ghj7" podStartSLOduration=1.621051161 podStartE2EDuration="3.5022657s" podCreationTimestamp="2026-04-21 01:52:00 +0000 UTC" firstStartedPulling="2026-04-21 01:52:01.108435063 +0000 UTC m=+118.687737102" lastFinishedPulling="2026-04-21 01:52:02.989649596 +0000 UTC m=+120.568951641" observedRunningTime="2026-04-21 01:52:03.502130969 +0000 UTC m=+121.081433026" 
watchObservedRunningTime="2026-04-21 01:52:03.5022657 +0000 UTC m=+121.081567759" Apr 21 01:52:04.226239 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:04.226204 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/70601297-057d-42c8-bef0-315d6797ccfd-metrics-certs\") pod \"router-default-65f9585684-glq78\" (UID: \"70601297-057d-42c8-bef0-315d6797ccfd\") " pod="openshift-ingress/router-default-65f9585684-glq78" Apr 21 01:52:04.226622 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:04.226273 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/70601297-057d-42c8-bef0-315d6797ccfd-service-ca-bundle\") pod \"router-default-65f9585684-glq78\" (UID: \"70601297-057d-42c8-bef0-315d6797ccfd\") " pod="openshift-ingress/router-default-65f9585684-glq78" Apr 21 01:52:04.226961 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:04.226943 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/70601297-057d-42c8-bef0-315d6797ccfd-service-ca-bundle\") pod \"router-default-65f9585684-glq78\" (UID: \"70601297-057d-42c8-bef0-315d6797ccfd\") " pod="openshift-ingress/router-default-65f9585684-glq78" Apr 21 01:52:04.228526 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:04.228495 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/70601297-057d-42c8-bef0-315d6797ccfd-metrics-certs\") pod \"router-default-65f9585684-glq78\" (UID: \"70601297-057d-42c8-bef0-315d6797ccfd\") " pod="openshift-ingress/router-default-65f9585684-glq78" Apr 21 01:52:04.479139 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:04.479053 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-9t7wd\"" Apr 21 01:52:04.486751 ip-10-0-129-52 
kubenswrapper[2573]: I0421 01:52:04.486727 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-65f9585684-glq78" Apr 21 01:52:04.605399 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:04.605367 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-65f9585684-glq78"] Apr 21 01:52:04.608238 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:52:04.608206 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod70601297_057d_42c8_bef0_315d6797ccfd.slice/crio-030ac2ba7bad30813c7a93f5ad1382b2056d8ec571e6d2461594229e672395bb WatchSource:0}: Error finding container 030ac2ba7bad30813c7a93f5ad1382b2056d8ec571e6d2461594229e672395bb: Status 404 returned error can't find the container with id 030ac2ba7bad30813c7a93f5ad1382b2056d8ec571e6d2461594229e672395bb Apr 21 01:52:05.491438 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:05.491406 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-65f9585684-glq78" event={"ID":"70601297-057d-42c8-bef0-315d6797ccfd","Type":"ContainerStarted","Data":"cec61887f85d2e7b5d648c0941dd4f1d287ac1b88300d01c9914e3a92a360772"} Apr 21 01:52:05.491438 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:05.491444 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-65f9585684-glq78" event={"ID":"70601297-057d-42c8-bef0-315d6797ccfd","Type":"ContainerStarted","Data":"030ac2ba7bad30813c7a93f5ad1382b2056d8ec571e6d2461594229e672395bb"} Apr 21 01:52:05.507291 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:05.507231 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-65f9585684-glq78" podStartSLOduration=33.50721596 podStartE2EDuration="33.50721596s" podCreationTimestamp="2026-04-21 01:51:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 01:52:05.506906261 +0000 UTC m=+123.086208319" watchObservedRunningTime="2026-04-21 01:52:05.50721596 +0000 UTC m=+123.086518046" Apr 21 01:52:05.903669 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:05.903637 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6dbfbfc9c4-4gx44"] Apr 21 01:52:05.906769 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:05.906752 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6dbfbfc9c4-4gx44" Apr 21 01:52:05.909526 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:05.909495 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 21 01:52:05.909666 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:05.909619 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 21 01:52:05.909887 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:05.909864 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 21 01:52:05.910239 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:05.910217 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-z9cqm\"" Apr 21 01:52:05.910701 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:05.910678 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 21 01:52:05.910995 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:05.910977 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 21 01:52:05.911071 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:05.911023 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 21 01:52:05.911790 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:05.911768 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 21 01:52:05.916709 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:05.916687 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6dbfbfc9c4-4gx44"] Apr 21 01:52:06.040029 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:06.039987 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0aa4d1d4-0b57-47cc-92a7-f6bd34b18794-service-ca\") pod \"console-6dbfbfc9c4-4gx44\" (UID: \"0aa4d1d4-0b57-47cc-92a7-f6bd34b18794\") " pod="openshift-console/console-6dbfbfc9c4-4gx44" Apr 21 01:52:06.040029 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:06.040030 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8sqgb\" (UniqueName: \"kubernetes.io/projected/0aa4d1d4-0b57-47cc-92a7-f6bd34b18794-kube-api-access-8sqgb\") pod \"console-6dbfbfc9c4-4gx44\" (UID: \"0aa4d1d4-0b57-47cc-92a7-f6bd34b18794\") " pod="openshift-console/console-6dbfbfc9c4-4gx44" Apr 21 01:52:06.040290 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:06.040063 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0aa4d1d4-0b57-47cc-92a7-f6bd34b18794-console-config\") pod \"console-6dbfbfc9c4-4gx44\" (UID: \"0aa4d1d4-0b57-47cc-92a7-f6bd34b18794\") " pod="openshift-console/console-6dbfbfc9c4-4gx44" Apr 21 01:52:06.040290 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:06.040138 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/0aa4d1d4-0b57-47cc-92a7-f6bd34b18794-oauth-serving-cert\") pod \"console-6dbfbfc9c4-4gx44\" (UID: \"0aa4d1d4-0b57-47cc-92a7-f6bd34b18794\") " pod="openshift-console/console-6dbfbfc9c4-4gx44" Apr 21 01:52:06.040290 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:06.040209 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0aa4d1d4-0b57-47cc-92a7-f6bd34b18794-console-oauth-config\") pod \"console-6dbfbfc9c4-4gx44\" (UID: \"0aa4d1d4-0b57-47cc-92a7-f6bd34b18794\") " pod="openshift-console/console-6dbfbfc9c4-4gx44" Apr 21 01:52:06.040416 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:06.040299 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0aa4d1d4-0b57-47cc-92a7-f6bd34b18794-console-serving-cert\") pod \"console-6dbfbfc9c4-4gx44\" (UID: \"0aa4d1d4-0b57-47cc-92a7-f6bd34b18794\") " pod="openshift-console/console-6dbfbfc9c4-4gx44" Apr 21 01:52:06.141272 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:06.141233 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0aa4d1d4-0b57-47cc-92a7-f6bd34b18794-oauth-serving-cert\") pod \"console-6dbfbfc9c4-4gx44\" (UID: \"0aa4d1d4-0b57-47cc-92a7-f6bd34b18794\") " pod="openshift-console/console-6dbfbfc9c4-4gx44" Apr 21 01:52:06.141420 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:06.141307 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0aa4d1d4-0b57-47cc-92a7-f6bd34b18794-console-oauth-config\") pod \"console-6dbfbfc9c4-4gx44\" (UID: \"0aa4d1d4-0b57-47cc-92a7-f6bd34b18794\") " pod="openshift-console/console-6dbfbfc9c4-4gx44" Apr 21 01:52:06.141420 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:06.141392 
2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0aa4d1d4-0b57-47cc-92a7-f6bd34b18794-console-serving-cert\") pod \"console-6dbfbfc9c4-4gx44\" (UID: \"0aa4d1d4-0b57-47cc-92a7-f6bd34b18794\") " pod="openshift-console/console-6dbfbfc9c4-4gx44"
Apr 21 01:52:06.141529 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:06.141424 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0aa4d1d4-0b57-47cc-92a7-f6bd34b18794-service-ca\") pod \"console-6dbfbfc9c4-4gx44\" (UID: \"0aa4d1d4-0b57-47cc-92a7-f6bd34b18794\") " pod="openshift-console/console-6dbfbfc9c4-4gx44"
Apr 21 01:52:06.141529 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:06.141450 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8sqgb\" (UniqueName: \"kubernetes.io/projected/0aa4d1d4-0b57-47cc-92a7-f6bd34b18794-kube-api-access-8sqgb\") pod \"console-6dbfbfc9c4-4gx44\" (UID: \"0aa4d1d4-0b57-47cc-92a7-f6bd34b18794\") " pod="openshift-console/console-6dbfbfc9c4-4gx44"
Apr 21 01:52:06.141529 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:06.141477 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0aa4d1d4-0b57-47cc-92a7-f6bd34b18794-console-config\") pod \"console-6dbfbfc9c4-4gx44\" (UID: \"0aa4d1d4-0b57-47cc-92a7-f6bd34b18794\") " pod="openshift-console/console-6dbfbfc9c4-4gx44"
Apr 21 01:52:06.142188 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:06.142160 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0aa4d1d4-0b57-47cc-92a7-f6bd34b18794-console-config\") pod \"console-6dbfbfc9c4-4gx44\" (UID: \"0aa4d1d4-0b57-47cc-92a7-f6bd34b18794\") " pod="openshift-console/console-6dbfbfc9c4-4gx44"
Apr 21 01:52:06.142303 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:06.142239 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0aa4d1d4-0b57-47cc-92a7-f6bd34b18794-service-ca\") pod \"console-6dbfbfc9c4-4gx44\" (UID: \"0aa4d1d4-0b57-47cc-92a7-f6bd34b18794\") " pod="openshift-console/console-6dbfbfc9c4-4gx44"
Apr 21 01:52:06.142716 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:06.142686 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0aa4d1d4-0b57-47cc-92a7-f6bd34b18794-oauth-serving-cert\") pod \"console-6dbfbfc9c4-4gx44\" (UID: \"0aa4d1d4-0b57-47cc-92a7-f6bd34b18794\") " pod="openshift-console/console-6dbfbfc9c4-4gx44"
Apr 21 01:52:06.143969 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:06.143948 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0aa4d1d4-0b57-47cc-92a7-f6bd34b18794-console-serving-cert\") pod \"console-6dbfbfc9c4-4gx44\" (UID: \"0aa4d1d4-0b57-47cc-92a7-f6bd34b18794\") " pod="openshift-console/console-6dbfbfc9c4-4gx44"
Apr 21 01:52:06.143969 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:06.143957 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0aa4d1d4-0b57-47cc-92a7-f6bd34b18794-console-oauth-config\") pod \"console-6dbfbfc9c4-4gx44\" (UID: \"0aa4d1d4-0b57-47cc-92a7-f6bd34b18794\") " pod="openshift-console/console-6dbfbfc9c4-4gx44"
Apr 21 01:52:06.150020 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:06.149998 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8sqgb\" (UniqueName: \"kubernetes.io/projected/0aa4d1d4-0b57-47cc-92a7-f6bd34b18794-kube-api-access-8sqgb\") pod \"console-6dbfbfc9c4-4gx44\" (UID: \"0aa4d1d4-0b57-47cc-92a7-f6bd34b18794\") " pod="openshift-console/console-6dbfbfc9c4-4gx44"
Apr 21 01:52:06.219291 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:06.219205 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6dbfbfc9c4-4gx44"
Apr 21 01:52:06.334332 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:06.334302 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6dbfbfc9c4-4gx44"]
Apr 21 01:52:06.337295 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:52:06.337265 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0aa4d1d4_0b57_47cc_92a7_f6bd34b18794.slice/crio-76e5ae957e7cd01ab36ac8dd435c8b3b9cb2e5a7f4e42b43f5be0b626dc392c9 WatchSource:0}: Error finding container 76e5ae957e7cd01ab36ac8dd435c8b3b9cb2e5a7f4e42b43f5be0b626dc392c9: Status 404 returned error can't find the container with id 76e5ae957e7cd01ab36ac8dd435c8b3b9cb2e5a7f4e42b43f5be0b626dc392c9
Apr 21 01:52:06.486970 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:06.486891 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-65f9585684-glq78"
Apr 21 01:52:06.489532 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:06.489509 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-65f9585684-glq78"
Apr 21 01:52:06.494403 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:06.494377 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6dbfbfc9c4-4gx44" event={"ID":"0aa4d1d4-0b57-47cc-92a7-f6bd34b18794","Type":"ContainerStarted","Data":"76e5ae957e7cd01ab36ac8dd435c8b3b9cb2e5a7f4e42b43f5be0b626dc392c9"}
Apr 21 01:52:06.494690 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:06.494557 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-65f9585684-glq78"
Apr 21 01:52:06.495615 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:06.495598 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-65f9585684-glq78"
Apr 21 01:52:07.327604 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:07.327573 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-pnf4r"]
Apr 21 01:52:07.330726 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:07.330700 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-pnf4r"
Apr 21 01:52:07.333022 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:07.332991 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-l7r7z\""
Apr 21 01:52:07.333138 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:07.333024 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\""
Apr 21 01:52:07.341068 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:07.341019 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-pnf4r"]
Apr 21 01:52:07.452433 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:07.452390 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/84f2628a-fb6e-4827-bbeb-2a4ae45d0b29-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-pnf4r\" (UID: \"84f2628a-fb6e-4827-bbeb-2a4ae45d0b29\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-pnf4r"
Apr 21 01:52:07.553503 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:07.553465 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/84f2628a-fb6e-4827-bbeb-2a4ae45d0b29-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-pnf4r\" (UID: \"84f2628a-fb6e-4827-bbeb-2a4ae45d0b29\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-pnf4r"
Apr 21 01:52:07.556245 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:07.556218 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/84f2628a-fb6e-4827-bbeb-2a4ae45d0b29-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-pnf4r\" (UID: \"84f2628a-fb6e-4827-bbeb-2a4ae45d0b29\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-pnf4r"
Apr 21 01:52:07.642775 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:07.642741 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-pnf4r"
Apr 21 01:52:07.803992 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:07.803952 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-pnf4r"]
Apr 21 01:52:07.807287 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:52:07.807257 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod84f2628a_fb6e_4827_bbeb_2a4ae45d0b29.slice/crio-4ee42fd62082920ddffc1cbea43438e20867b121a4fddfc1af84972fce96868c WatchSource:0}: Error finding container 4ee42fd62082920ddffc1cbea43438e20867b121a4fddfc1af84972fce96868c: Status 404 returned error can't find the container with id 4ee42fd62082920ddffc1cbea43438e20867b121a4fddfc1af84972fce96868c
Apr 21 01:52:08.500806 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:08.500756 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-pnf4r" event={"ID":"84f2628a-fb6e-4827-bbeb-2a4ae45d0b29","Type":"ContainerStarted","Data":"4ee42fd62082920ddffc1cbea43438e20867b121a4fddfc1af84972fce96868c"}
Apr 21 01:52:09.508830 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:09.508770 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6dbfbfc9c4-4gx44" event={"ID":"0aa4d1d4-0b57-47cc-92a7-f6bd34b18794","Type":"ContainerStarted","Data":"0071707dc2b740cd875c0542b49744294fbae1fb52d03a1f69e465c786762db8"}
Apr 21 01:52:09.525317 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:09.525266 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6dbfbfc9c4-4gx44" podStartSLOduration=1.807255899 podStartE2EDuration="4.525249245s" podCreationTimestamp="2026-04-21 01:52:05 +0000 UTC" firstStartedPulling="2026-04-21 01:52:06.339341969 +0000 UTC m=+123.918644006" lastFinishedPulling="2026-04-21 01:52:09.057335303 +0000 UTC m=+126.636637352" observedRunningTime="2026-04-21 01:52:09.524127254 +0000 UTC m=+127.103429312" watchObservedRunningTime="2026-04-21 01:52:09.525249245 +0000 UTC m=+127.104551304"
Apr 21 01:52:10.512108 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:10.512068 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-pnf4r" event={"ID":"84f2628a-fb6e-4827-bbeb-2a4ae45d0b29","Type":"ContainerStarted","Data":"c59e79a64c59168709d1c26e15d25d95e27ece51b94dad7599787c353598afe8"}
Apr 21 01:52:10.512476 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:10.512355 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-pnf4r"
Apr 21 01:52:10.516934 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:10.516907 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-pnf4r"
Apr 21 01:52:10.529251 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:10.529194 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-pnf4r" podStartSLOduration=1.535973171 podStartE2EDuration="3.529177826s" podCreationTimestamp="2026-04-21 01:52:07 +0000 UTC" firstStartedPulling="2026-04-21 01:52:07.809470478 +0000 UTC m=+125.388772515" lastFinishedPulling="2026-04-21 01:52:09.802675135 +0000 UTC m=+127.381977170" observedRunningTime="2026-04-21 01:52:10.528468445 +0000 UTC m=+128.107770503" watchObservedRunningTime="2026-04-21 01:52:10.529177826 +0000 UTC m=+128.108479884"
Apr 21 01:52:11.383972 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:11.383941 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-2vpnk"]
Apr 21 01:52:11.387058 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:11.387039 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-2vpnk"
Apr 21 01:52:11.390488 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:11.390462 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\""
Apr 21 01:52:11.390632 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:11.390498 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 21 01:52:11.390632 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:11.390497 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 21 01:52:11.390749 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:11.390695 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\""
Apr 21 01:52:11.390749 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:11.390733 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-g6bmg\""
Apr 21 01:52:11.390855 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:11.390768 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 21 01:52:11.395649 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:11.395629 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-2vpnk"]
Apr 21 01:52:11.487399 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:11.487359 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxv9v\" (UniqueName: \"kubernetes.io/projected/820f6e10-eab2-4509-ab5a-c5aca3e4b769-kube-api-access-mxv9v\") pod \"prometheus-operator-5676c8c784-2vpnk\" (UID: \"820f6e10-eab2-4509-ab5a-c5aca3e4b769\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-2vpnk"
Apr 21 01:52:11.487588 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:11.487465 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/820f6e10-eab2-4509-ab5a-c5aca3e4b769-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-2vpnk\" (UID: \"820f6e10-eab2-4509-ab5a-c5aca3e4b769\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-2vpnk"
Apr 21 01:52:11.487588 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:11.487501 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/820f6e10-eab2-4509-ab5a-c5aca3e4b769-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-2vpnk\" (UID: \"820f6e10-eab2-4509-ab5a-c5aca3e4b769\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-2vpnk"
Apr 21 01:52:11.487588 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:11.487521 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/820f6e10-eab2-4509-ab5a-c5aca3e4b769-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-2vpnk\" (UID: \"820f6e10-eab2-4509-ab5a-c5aca3e4b769\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-2vpnk"
Apr 21 01:52:11.588397 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:11.588363 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/820f6e10-eab2-4509-ab5a-c5aca3e4b769-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-2vpnk\" (UID: \"820f6e10-eab2-4509-ab5a-c5aca3e4b769\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-2vpnk"
Apr 21 01:52:11.588397 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:11.588411 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/820f6e10-eab2-4509-ab5a-c5aca3e4b769-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-2vpnk\" (UID: \"820f6e10-eab2-4509-ab5a-c5aca3e4b769\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-2vpnk"
Apr 21 01:52:11.588941 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:11.588435 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/820f6e10-eab2-4509-ab5a-c5aca3e4b769-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-2vpnk\" (UID: \"820f6e10-eab2-4509-ab5a-c5aca3e4b769\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-2vpnk"
Apr 21 01:52:11.588941 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:11.588456 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mxv9v\" (UniqueName: \"kubernetes.io/projected/820f6e10-eab2-4509-ab5a-c5aca3e4b769-kube-api-access-mxv9v\") pod \"prometheus-operator-5676c8c784-2vpnk\" (UID: \"820f6e10-eab2-4509-ab5a-c5aca3e4b769\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-2vpnk"
Apr 21 01:52:11.588941 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:52:11.588546 2573 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-tls: secret "prometheus-operator-tls" not found
Apr 21 01:52:11.588941 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:52:11.588625 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/820f6e10-eab2-4509-ab5a-c5aca3e4b769-prometheus-operator-tls podName:820f6e10-eab2-4509-ab5a-c5aca3e4b769 nodeName:}" failed. No retries permitted until 2026-04-21 01:52:12.088605734 +0000 UTC m=+129.667907769 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/820f6e10-eab2-4509-ab5a-c5aca3e4b769-prometheus-operator-tls") pod "prometheus-operator-5676c8c784-2vpnk" (UID: "820f6e10-eab2-4509-ab5a-c5aca3e4b769") : secret "prometheus-operator-tls" not found
Apr 21 01:52:11.589102 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:11.589083 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/820f6e10-eab2-4509-ab5a-c5aca3e4b769-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-2vpnk\" (UID: \"820f6e10-eab2-4509-ab5a-c5aca3e4b769\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-2vpnk"
Apr 21 01:52:11.591022 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:11.591003 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/820f6e10-eab2-4509-ab5a-c5aca3e4b769-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-2vpnk\" (UID: \"820f6e10-eab2-4509-ab5a-c5aca3e4b769\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-2vpnk"
Apr 21 01:52:11.597087 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:11.597061 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxv9v\" (UniqueName: \"kubernetes.io/projected/820f6e10-eab2-4509-ab5a-c5aca3e4b769-kube-api-access-mxv9v\") pod \"prometheus-operator-5676c8c784-2vpnk\" (UID: \"820f6e10-eab2-4509-ab5a-c5aca3e4b769\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-2vpnk"
Apr 21 01:52:11.891412 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:11.891374 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9c103689-40cc-470b-9109-33a63ff6f5dd-metrics-certs\") pod \"network-metrics-daemon-mfs4c\" (UID: \"9c103689-40cc-470b-9109-33a63ff6f5dd\") " pod="openshift-multus/network-metrics-daemon-mfs4c"
Apr 21 01:52:11.893665 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:11.893637 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9c103689-40cc-470b-9109-33a63ff6f5dd-metrics-certs\") pod \"network-metrics-daemon-mfs4c\" (UID: \"9c103689-40cc-470b-9109-33a63ff6f5dd\") " pod="openshift-multus/network-metrics-daemon-mfs4c"
Apr 21 01:52:12.092993 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:12.092938 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/820f6e10-eab2-4509-ab5a-c5aca3e4b769-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-2vpnk\" (UID: \"820f6e10-eab2-4509-ab5a-c5aca3e4b769\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-2vpnk"
Apr 21 01:52:12.095958 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:12.095939 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/820f6e10-eab2-4509-ab5a-c5aca3e4b769-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-2vpnk\" (UID: \"820f6e10-eab2-4509-ab5a-c5aca3e4b769\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-2vpnk"
Apr 21 01:52:12.182918 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:12.182800 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-68xr4\""
Apr 21 01:52:12.190929 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:12.190901 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mfs4c"
Apr 21 01:52:12.296234 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:12.296193 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-2vpnk"
Apr 21 01:52:12.306849 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:12.306808 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-mfs4c"]
Apr 21 01:52:12.310470 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:52:12.310444 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9c103689_40cc_470b_9109_33a63ff6f5dd.slice/crio-9519a8ccef1025bb372a74c3876187c946ed52b725f8e38382f0142aa89d86a6 WatchSource:0}: Error finding container 9519a8ccef1025bb372a74c3876187c946ed52b725f8e38382f0142aa89d86a6: Status 404 returned error can't find the container with id 9519a8ccef1025bb372a74c3876187c946ed52b725f8e38382f0142aa89d86a6
Apr 21 01:52:12.411341 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:12.411309 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-2vpnk"]
Apr 21 01:52:12.414028 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:52:12.414001 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod820f6e10_eab2_4509_ab5a_c5aca3e4b769.slice/crio-92acecb09ac2b99527860d7606406d1d5c4f550443bc455d60bcf1e7bf13486b WatchSource:0}: Error finding container 92acecb09ac2b99527860d7606406d1d5c4f550443bc455d60bcf1e7bf13486b: Status 404 returned error can't find the container with id 92acecb09ac2b99527860d7606406d1d5c4f550443bc455d60bcf1e7bf13486b
Apr 21 01:52:12.518601 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:12.518502 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-2vpnk" event={"ID":"820f6e10-eab2-4509-ab5a-c5aca3e4b769","Type":"ContainerStarted","Data":"92acecb09ac2b99527860d7606406d1d5c4f550443bc455d60bcf1e7bf13486b"}
Apr 21 01:52:12.519473 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:12.519452 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-mfs4c" event={"ID":"9c103689-40cc-470b-9109-33a63ff6f5dd","Type":"ContainerStarted","Data":"9519a8ccef1025bb372a74c3876187c946ed52b725f8e38382f0142aa89d86a6"}
Apr 21 01:52:13.524953 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:13.524872 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-mfs4c" event={"ID":"9c103689-40cc-470b-9109-33a63ff6f5dd","Type":"ContainerStarted","Data":"7456b349cd687a2051a1eeae3e25d44918dfe6d0e17d91e6e14d5433b7629209"}
Apr 21 01:52:13.895041 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:13.895005 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-868c6cc5cf-wqxfb"]
Apr 21 01:52:13.898053 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:13.898035 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-868c6cc5cf-wqxfb"
Apr 21 01:52:13.905253 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:13.905226 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\""
Apr 21 01:52:13.907983 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:13.907957 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-868c6cc5cf-wqxfb"]
Apr 21 01:52:14.008085 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:14.008043 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6b734ae0-7dff-4e82-b981-5c23af27b113-console-serving-cert\") pod \"console-868c6cc5cf-wqxfb\" (UID: \"6b734ae0-7dff-4e82-b981-5c23af27b113\") " pod="openshift-console/console-868c6cc5cf-wqxfb"
Apr 21 01:52:14.008085 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:14.008087 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6b734ae0-7dff-4e82-b981-5c23af27b113-oauth-serving-cert\") pod \"console-868c6cc5cf-wqxfb\" (UID: \"6b734ae0-7dff-4e82-b981-5c23af27b113\") " pod="openshift-console/console-868c6cc5cf-wqxfb"
Apr 21 01:52:14.008292 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:14.008118 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dtsnq\" (UniqueName: \"kubernetes.io/projected/6b734ae0-7dff-4e82-b981-5c23af27b113-kube-api-access-dtsnq\") pod \"console-868c6cc5cf-wqxfb\" (UID: \"6b734ae0-7dff-4e82-b981-5c23af27b113\") " pod="openshift-console/console-868c6cc5cf-wqxfb"
Apr 21 01:52:14.008292 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:14.008149 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6b734ae0-7dff-4e82-b981-5c23af27b113-console-oauth-config\") pod \"console-868c6cc5cf-wqxfb\" (UID: \"6b734ae0-7dff-4e82-b981-5c23af27b113\") " pod="openshift-console/console-868c6cc5cf-wqxfb"
Apr 21 01:52:14.008292 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:14.008205 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6b734ae0-7dff-4e82-b981-5c23af27b113-service-ca\") pod \"console-868c6cc5cf-wqxfb\" (UID: \"6b734ae0-7dff-4e82-b981-5c23af27b113\") " pod="openshift-console/console-868c6cc5cf-wqxfb"
Apr 21 01:52:14.008292 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:14.008224 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6b734ae0-7dff-4e82-b981-5c23af27b113-trusted-ca-bundle\") pod \"console-868c6cc5cf-wqxfb\" (UID: \"6b734ae0-7dff-4e82-b981-5c23af27b113\") " pod="openshift-console/console-868c6cc5cf-wqxfb"
Apr 21 01:52:14.008292 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:14.008269 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6b734ae0-7dff-4e82-b981-5c23af27b113-console-config\") pod \"console-868c6cc5cf-wqxfb\" (UID: \"6b734ae0-7dff-4e82-b981-5c23af27b113\") " pod="openshift-console/console-868c6cc5cf-wqxfb"
Apr 21 01:52:14.108970 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:14.108879 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6b734ae0-7dff-4e82-b981-5c23af27b113-console-serving-cert\") pod \"console-868c6cc5cf-wqxfb\" (UID: \"6b734ae0-7dff-4e82-b981-5c23af27b113\") " pod="openshift-console/console-868c6cc5cf-wqxfb"
Apr 21 01:52:14.108970 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:14.108938 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6b734ae0-7dff-4e82-b981-5c23af27b113-oauth-serving-cert\") pod \"console-868c6cc5cf-wqxfb\" (UID: \"6b734ae0-7dff-4e82-b981-5c23af27b113\") " pod="openshift-console/console-868c6cc5cf-wqxfb"
Apr 21 01:52:14.108970 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:14.108968 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dtsnq\" (UniqueName: \"kubernetes.io/projected/6b734ae0-7dff-4e82-b981-5c23af27b113-kube-api-access-dtsnq\") pod \"console-868c6cc5cf-wqxfb\" (UID: \"6b734ae0-7dff-4e82-b981-5c23af27b113\") " pod="openshift-console/console-868c6cc5cf-wqxfb"
Apr 21 01:52:14.109174 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:14.109001 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6b734ae0-7dff-4e82-b981-5c23af27b113-console-oauth-config\") pod \"console-868c6cc5cf-wqxfb\" (UID: \"6b734ae0-7dff-4e82-b981-5c23af27b113\") " pod="openshift-console/console-868c6cc5cf-wqxfb"
Apr 21 01:52:14.109174 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:14.109035 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6b734ae0-7dff-4e82-b981-5c23af27b113-service-ca\") pod \"console-868c6cc5cf-wqxfb\" (UID: \"6b734ae0-7dff-4e82-b981-5c23af27b113\") " pod="openshift-console/console-868c6cc5cf-wqxfb"
Apr 21 01:52:14.109174 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:14.109063 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6b734ae0-7dff-4e82-b981-5c23af27b113-trusted-ca-bundle\") pod \"console-868c6cc5cf-wqxfb\" (UID: \"6b734ae0-7dff-4e82-b981-5c23af27b113\") " pod="openshift-console/console-868c6cc5cf-wqxfb"
Apr 21 01:52:14.109174 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:14.109124 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6b734ae0-7dff-4e82-b981-5c23af27b113-console-config\") pod \"console-868c6cc5cf-wqxfb\" (UID: \"6b734ae0-7dff-4e82-b981-5c23af27b113\") " pod="openshift-console/console-868c6cc5cf-wqxfb"
Apr 21 01:52:14.109770 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:14.109741 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6b734ae0-7dff-4e82-b981-5c23af27b113-service-ca\") pod \"console-868c6cc5cf-wqxfb\" (UID: \"6b734ae0-7dff-4e82-b981-5c23af27b113\") " pod="openshift-console/console-868c6cc5cf-wqxfb"
Apr 21 01:52:14.109974 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:14.109950 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6b734ae0-7dff-4e82-b981-5c23af27b113-trusted-ca-bundle\") pod \"console-868c6cc5cf-wqxfb\" (UID: \"6b734ae0-7dff-4e82-b981-5c23af27b113\") " pod="openshift-console/console-868c6cc5cf-wqxfb"
Apr 21 01:52:14.110044 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:14.109950 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6b734ae0-7dff-4e82-b981-5c23af27b113-console-config\") pod \"console-868c6cc5cf-wqxfb\" (UID: \"6b734ae0-7dff-4e82-b981-5c23af27b113\") " pod="openshift-console/console-868c6cc5cf-wqxfb"
Apr 21 01:52:14.110044 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:14.109955 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6b734ae0-7dff-4e82-b981-5c23af27b113-oauth-serving-cert\") pod \"console-868c6cc5cf-wqxfb\" (UID: \"6b734ae0-7dff-4e82-b981-5c23af27b113\") " pod="openshift-console/console-868c6cc5cf-wqxfb"
Apr 21 01:52:14.111488 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:14.111465 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6b734ae0-7dff-4e82-b981-5c23af27b113-console-oauth-config\") pod \"console-868c6cc5cf-wqxfb\" (UID: \"6b734ae0-7dff-4e82-b981-5c23af27b113\") " pod="openshift-console/console-868c6cc5cf-wqxfb"
Apr 21 01:52:14.111586 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:14.111510 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6b734ae0-7dff-4e82-b981-5c23af27b113-console-serving-cert\") pod \"console-868c6cc5cf-wqxfb\" (UID: \"6b734ae0-7dff-4e82-b981-5c23af27b113\") " pod="openshift-console/console-868c6cc5cf-wqxfb"
Apr 21 01:52:14.117102 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:14.117080 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dtsnq\" (UniqueName: \"kubernetes.io/projected/6b734ae0-7dff-4e82-b981-5c23af27b113-kube-api-access-dtsnq\") pod \"console-868c6cc5cf-wqxfb\" (UID: \"6b734ae0-7dff-4e82-b981-5c23af27b113\") " pod="openshift-console/console-868c6cc5cf-wqxfb"
Apr 21 01:52:14.207865 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:14.207805 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-868c6cc5cf-wqxfb"
Apr 21 01:52:14.325415 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:14.325088 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-868c6cc5cf-wqxfb"]
Apr 21 01:52:14.327634 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:52:14.327596 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6b734ae0_7dff_4e82_b981_5c23af27b113.slice/crio-72de14efddb0da4cf735941fc49731eeee2647384e5ed6eb9d3a115ebe6cb998 WatchSource:0}: Error finding container 72de14efddb0da4cf735941fc49731eeee2647384e5ed6eb9d3a115ebe6cb998: Status 404 returned error can't find the container with id 72de14efddb0da4cf735941fc49731eeee2647384e5ed6eb9d3a115ebe6cb998
Apr 21 01:52:14.533549 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:14.533510 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-2vpnk" event={"ID":"820f6e10-eab2-4509-ab5a-c5aca3e4b769","Type":"ContainerStarted","Data":"3d4d5809022db12fbb307002d657a78dcae351f50a337eb978415514a0023818"}
Apr 21 01:52:14.533549 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:14.533555 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-2vpnk" event={"ID":"820f6e10-eab2-4509-ab5a-c5aca3e4b769","Type":"ContainerStarted","Data":"21dba47262d194651d540a951dead246bdd780c7519c611ab13305bfebdcb5eb"}
Apr 21 01:52:14.535003 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:14.534975 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-mfs4c" event={"ID":"9c103689-40cc-470b-9109-33a63ff6f5dd","Type":"ContainerStarted","Data":"f16f96aa96891278838a7395c159bb9601a0a9d661d53a6721dda42b6bed46cd"}
Apr 21 01:52:14.536214 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:14.536189 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-868c6cc5cf-wqxfb" event={"ID":"6b734ae0-7dff-4e82-b981-5c23af27b113","Type":"ContainerStarted","Data":"3cd371062a3e430258d7d43adbc5b64a7595a00939e68dc68d45290ee332707a"}
Apr 21 01:52:14.536302 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:14.536222 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-868c6cc5cf-wqxfb" event={"ID":"6b734ae0-7dff-4e82-b981-5c23af27b113","Type":"ContainerStarted","Data":"72de14efddb0da4cf735941fc49731eeee2647384e5ed6eb9d3a115ebe6cb998"}
Apr 21 01:52:14.550502 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:14.550453 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-2vpnk" podStartSLOduration=2.294428205 podStartE2EDuration="3.550439934s" podCreationTimestamp="2026-04-21 01:52:11 +0000 UTC" firstStartedPulling="2026-04-21 01:52:12.41579119 +0000 UTC m=+129.995093226" lastFinishedPulling="2026-04-21 01:52:13.671802902 +0000 UTC m=+131.251104955" observedRunningTime="2026-04-21 01:52:14.549848481 +0000 UTC m=+132.129150548" watchObservedRunningTime="2026-04-21 01:52:14.550439934 +0000 UTC m=+132.129742009"
Apr 21 01:52:14.564182 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:14.564124 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-mfs4c" podStartSLOduration=130.630296632 podStartE2EDuration="2m11.56410655s" podCreationTimestamp="2026-04-21 01:50:03 +0000 UTC" firstStartedPulling="2026-04-21 01:52:12.312263432 +0000 UTC m=+129.891565467" lastFinishedPulling="2026-04-21 01:52:13.246073335 +0000 UTC m=+130.825375385" observedRunningTime="2026-04-21 01:52:14.563443174 +0000 UTC m=+132.142745231" watchObservedRunningTime="2026-04-21 01:52:14.56410655 +0000 UTC m=+132.143408632"
Apr 21 01:52:14.579271 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:14.579217 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration"
pod="openshift-console/console-868c6cc5cf-wqxfb" podStartSLOduration=1.579199187 podStartE2EDuration="1.579199187s" podCreationTimestamp="2026-04-21 01:52:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 01:52:14.579023224 +0000 UTC m=+132.158325308" watchObservedRunningTime="2026-04-21 01:52:14.579199187 +0000 UTC m=+132.158501244" Apr 21 01:52:16.219591 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:16.219549 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6dbfbfc9c4-4gx44" Apr 21 01:52:16.219591 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:16.219587 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6dbfbfc9c4-4gx44" Apr 21 01:52:16.224219 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:16.224196 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6dbfbfc9c4-4gx44" Apr 21 01:52:16.544966 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:16.544876 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6dbfbfc9c4-4gx44" Apr 21 01:52:16.713276 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:16.713237 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-gptvn"] Apr 21 01:52:16.717199 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:16.717174 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gptvn" Apr 21 01:52:16.720534 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:16.720494 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\"" Apr 21 01:52:16.720906 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:16.720512 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-sk8wg\"" Apr 21 01:52:16.721043 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:16.720539 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\"" Apr 21 01:52:16.723144 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:16.723124 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-bklrp"] Apr 21 01:52:16.727317 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:16.727301 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-bklrp" Apr 21 01:52:16.727931 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:16.727908 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-gptvn"] Apr 21 01:52:16.730237 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:16.730151 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-88cnn\"" Apr 21 01:52:16.730503 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:16.730483 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\"" Apr 21 01:52:16.730624 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:16.730595 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\"" Apr 21 01:52:16.730761 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:16.730745 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\"" Apr 21 01:52:16.737699 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:16.737677 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-npfqq"] Apr 21 01:52:16.747425 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:16.747396 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-bklrp"] Apr 21 01:52:16.747787 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:16.747769 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-npfqq" Apr 21 01:52:16.751481 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:16.751456 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 21 01:52:16.751605 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:16.751484 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 21 01:52:16.751756 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:16.751730 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-2rf2h\"" Apr 21 01:52:16.751984 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:16.751967 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 21 01:52:16.830915 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:16.830806 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4bw9h\" (UniqueName: \"kubernetes.io/projected/48803557-e6ce-4aa0-8b91-f591cb28551b-kube-api-access-4bw9h\") pod \"openshift-state-metrics-9d44df66c-gptvn\" (UID: \"48803557-e6ce-4aa0-8b91-f591cb28551b\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gptvn" Apr 21 01:52:16.830915 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:16.830877 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/a77bc043-0862-487d-a125-8ad685a25aed-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-bklrp\" (UID: \"a77bc043-0862-487d-a125-8ad685a25aed\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-bklrp" Apr 21 01:52:16.831118 ip-10-0-129-52 
kubenswrapper[2573]: I0421 01:52:16.830936 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/48803557-e6ce-4aa0-8b91-f591cb28551b-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-gptvn\" (UID: \"48803557-e6ce-4aa0-8b91-f591cb28551b\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gptvn" Apr 21 01:52:16.831118 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:16.831002 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/48803557-e6ce-4aa0-8b91-f591cb28551b-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-gptvn\" (UID: \"48803557-e6ce-4aa0-8b91-f591cb28551b\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gptvn" Apr 21 01:52:16.831118 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:16.831030 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/48803557-e6ce-4aa0-8b91-f591cb28551b-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-gptvn\" (UID: \"48803557-e6ce-4aa0-8b91-f591cb28551b\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gptvn" Apr 21 01:52:16.831118 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:16.831074 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/a77bc043-0862-487d-a125-8ad685a25aed-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-bklrp\" (UID: \"a77bc043-0862-487d-a125-8ad685a25aed\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-bklrp" Apr 21 01:52:16.831118 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:16.831113 2573 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a77bc043-0862-487d-a125-8ad685a25aed-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-bklrp\" (UID: \"a77bc043-0862-487d-a125-8ad685a25aed\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-bklrp" Apr 21 01:52:16.831271 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:16.831151 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/a77bc043-0862-487d-a125-8ad685a25aed-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-bklrp\" (UID: \"a77bc043-0862-487d-a125-8ad685a25aed\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-bklrp" Apr 21 01:52:16.831271 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:16.831214 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfs5j\" (UniqueName: \"kubernetes.io/projected/a77bc043-0862-487d-a125-8ad685a25aed-kube-api-access-pfs5j\") pod \"kube-state-metrics-69db897b98-bklrp\" (UID: \"a77bc043-0862-487d-a125-8ad685a25aed\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-bklrp" Apr 21 01:52:16.831332 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:16.831266 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a77bc043-0862-487d-a125-8ad685a25aed-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-bklrp\" (UID: \"a77bc043-0862-487d-a125-8ad685a25aed\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-bklrp" Apr 21 01:52:16.932143 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:16.932102 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/d46793fb-d7db-4f84-895f-8b7c420d58ab-node-exporter-textfile\") pod \"node-exporter-npfqq\" (UID: \"d46793fb-d7db-4f84-895f-8b7c420d58ab\") " pod="openshift-monitoring/node-exporter-npfqq" Apr 21 01:52:16.932143 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:16.932142 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d46793fb-d7db-4f84-895f-8b7c420d58ab-metrics-client-ca\") pod \"node-exporter-npfqq\" (UID: \"d46793fb-d7db-4f84-895f-8b7c420d58ab\") " pod="openshift-monitoring/node-exporter-npfqq" Apr 21 01:52:16.932364 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:16.932217 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a77bc043-0862-487d-a125-8ad685a25aed-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-bklrp\" (UID: \"a77bc043-0862-487d-a125-8ad685a25aed\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-bklrp" Apr 21 01:52:16.932364 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:16.932287 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/a77bc043-0862-487d-a125-8ad685a25aed-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-bklrp\" (UID: \"a77bc043-0862-487d-a125-8ad685a25aed\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-bklrp" Apr 21 01:52:16.932364 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:16.932319 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pfs5j\" (UniqueName: \"kubernetes.io/projected/a77bc043-0862-487d-a125-8ad685a25aed-kube-api-access-pfs5j\") pod \"kube-state-metrics-69db897b98-bklrp\" (UID: \"a77bc043-0862-487d-a125-8ad685a25aed\") " 
pod="openshift-monitoring/kube-state-metrics-69db897b98-bklrp" Apr 21 01:52:16.932364 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:16.932356 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a77bc043-0862-487d-a125-8ad685a25aed-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-bklrp\" (UID: \"a77bc043-0862-487d-a125-8ad685a25aed\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-bklrp" Apr 21 01:52:16.932541 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:16.932399 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4bw9h\" (UniqueName: \"kubernetes.io/projected/48803557-e6ce-4aa0-8b91-f591cb28551b-kube-api-access-4bw9h\") pod \"openshift-state-metrics-9d44df66c-gptvn\" (UID: \"48803557-e6ce-4aa0-8b91-f591cb28551b\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gptvn" Apr 21 01:52:16.932541 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:52:16.932424 2573 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: secret "kube-state-metrics-tls" not found Apr 21 01:52:16.932541 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:52:16.932494 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a77bc043-0862-487d-a125-8ad685a25aed-kube-state-metrics-tls podName:a77bc043-0862-487d-a125-8ad685a25aed nodeName:}" failed. No retries permitted until 2026-04-21 01:52:17.432473565 +0000 UTC m=+135.011775608 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/a77bc043-0862-487d-a125-8ad685a25aed-kube-state-metrics-tls") pod "kube-state-metrics-69db897b98-bklrp" (UID: "a77bc043-0862-487d-a125-8ad685a25aed") : secret "kube-state-metrics-tls" not found Apr 21 01:52:16.932710 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:16.932426 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/a77bc043-0862-487d-a125-8ad685a25aed-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-bklrp\" (UID: \"a77bc043-0862-487d-a125-8ad685a25aed\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-bklrp" Apr 21 01:52:16.932710 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:16.932627 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/d46793fb-d7db-4f84-895f-8b7c420d58ab-root\") pod \"node-exporter-npfqq\" (UID: \"d46793fb-d7db-4f84-895f-8b7c420d58ab\") " pod="openshift-monitoring/node-exporter-npfqq" Apr 21 01:52:16.932710 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:16.932655 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d46793fb-d7db-4f84-895f-8b7c420d58ab-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-npfqq\" (UID: \"d46793fb-d7db-4f84-895f-8b7c420d58ab\") " pod="openshift-monitoring/node-exporter-npfqq" Apr 21 01:52:16.932710 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:16.932688 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/48803557-e6ce-4aa0-8b91-f591cb28551b-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-gptvn\" (UID: 
\"48803557-e6ce-4aa0-8b91-f591cb28551b\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gptvn" Apr 21 01:52:16.932942 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:16.932714 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d46793fb-d7db-4f84-895f-8b7c420d58ab-sys\") pod \"node-exporter-npfqq\" (UID: \"d46793fb-d7db-4f84-895f-8b7c420d58ab\") " pod="openshift-monitoring/node-exporter-npfqq" Apr 21 01:52:16.932942 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:16.932743 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhp2t\" (UniqueName: \"kubernetes.io/projected/d46793fb-d7db-4f84-895f-8b7c420d58ab-kube-api-access-nhp2t\") pod \"node-exporter-npfqq\" (UID: \"d46793fb-d7db-4f84-895f-8b7c420d58ab\") " pod="openshift-monitoring/node-exporter-npfqq" Apr 21 01:52:16.932942 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:16.932826 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/48803557-e6ce-4aa0-8b91-f591cb28551b-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-gptvn\" (UID: \"48803557-e6ce-4aa0-8b91-f591cb28551b\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gptvn" Apr 21 01:52:16.932942 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:16.932861 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/48803557-e6ce-4aa0-8b91-f591cb28551b-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-gptvn\" (UID: \"48803557-e6ce-4aa0-8b91-f591cb28551b\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gptvn" Apr 21 01:52:16.932942 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:16.932891 2573 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/d46793fb-d7db-4f84-895f-8b7c420d58ab-node-exporter-tls\") pod \"node-exporter-npfqq\" (UID: \"d46793fb-d7db-4f84-895f-8b7c420d58ab\") " pod="openshift-monitoring/node-exporter-npfqq" Apr 21 01:52:16.932942 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:16.932916 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/d46793fb-d7db-4f84-895f-8b7c420d58ab-node-exporter-wtmp\") pod \"node-exporter-npfqq\" (UID: \"d46793fb-d7db-4f84-895f-8b7c420d58ab\") " pod="openshift-monitoring/node-exporter-npfqq" Apr 21 01:52:16.932942 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:16.932923 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a77bc043-0862-487d-a125-8ad685a25aed-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-bklrp\" (UID: \"a77bc043-0862-487d-a125-8ad685a25aed\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-bklrp" Apr 21 01:52:16.932942 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:16.932942 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/d46793fb-d7db-4f84-895f-8b7c420d58ab-node-exporter-accelerators-collector-config\") pod \"node-exporter-npfqq\" (UID: \"d46793fb-d7db-4f84-895f-8b7c420d58ab\") " pod="openshift-monitoring/node-exporter-npfqq" Apr 21 01:52:16.933323 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:16.933011 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/a77bc043-0862-487d-a125-8ad685a25aed-volume-directive-shadow\") pod 
\"kube-state-metrics-69db897b98-bklrp\" (UID: \"a77bc043-0862-487d-a125-8ad685a25aed\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-bklrp" Apr 21 01:52:16.933323 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:52:16.933255 2573 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: secret "openshift-state-metrics-tls" not found Apr 21 01:52:16.933323 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:16.933319 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/a77bc043-0862-487d-a125-8ad685a25aed-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-bklrp\" (UID: \"a77bc043-0862-487d-a125-8ad685a25aed\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-bklrp" Apr 21 01:52:16.933474 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:52:16.933370 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/48803557-e6ce-4aa0-8b91-f591cb28551b-openshift-state-metrics-tls podName:48803557-e6ce-4aa0-8b91-f591cb28551b nodeName:}" failed. No retries permitted until 2026-04-21 01:52:17.433349921 +0000 UTC m=+135.012651974 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/48803557-e6ce-4aa0-8b91-f591cb28551b-openshift-state-metrics-tls") pod "openshift-state-metrics-9d44df66c-gptvn" (UID: "48803557-e6ce-4aa0-8b91-f591cb28551b") : secret "openshift-state-metrics-tls" not found Apr 21 01:52:16.933594 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:16.933575 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/48803557-e6ce-4aa0-8b91-f591cb28551b-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-gptvn\" (UID: \"48803557-e6ce-4aa0-8b91-f591cb28551b\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gptvn" Apr 21 01:52:16.935296 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:16.935267 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a77bc043-0862-487d-a125-8ad685a25aed-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-bklrp\" (UID: \"a77bc043-0862-487d-a125-8ad685a25aed\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-bklrp" Apr 21 01:52:16.935607 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:16.935586 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/48803557-e6ce-4aa0-8b91-f591cb28551b-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-gptvn\" (UID: \"48803557-e6ce-4aa0-8b91-f591cb28551b\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gptvn" Apr 21 01:52:16.938155 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:16.938130 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: 
\"kubernetes.io/configmap/a77bc043-0862-487d-a125-8ad685a25aed-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-bklrp\" (UID: \"a77bc043-0862-487d-a125-8ad685a25aed\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-bklrp" Apr 21 01:52:16.942365 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:16.942339 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4bw9h\" (UniqueName: \"kubernetes.io/projected/48803557-e6ce-4aa0-8b91-f591cb28551b-kube-api-access-4bw9h\") pod \"openshift-state-metrics-9d44df66c-gptvn\" (UID: \"48803557-e6ce-4aa0-8b91-f591cb28551b\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gptvn" Apr 21 01:52:16.942515 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:16.942493 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfs5j\" (UniqueName: \"kubernetes.io/projected/a77bc043-0862-487d-a125-8ad685a25aed-kube-api-access-pfs5j\") pod \"kube-state-metrics-69db897b98-bklrp\" (UID: \"a77bc043-0862-487d-a125-8ad685a25aed\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-bklrp" Apr 21 01:52:17.033500 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:17.033467 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/d46793fb-d7db-4f84-895f-8b7c420d58ab-root\") pod \"node-exporter-npfqq\" (UID: \"d46793fb-d7db-4f84-895f-8b7c420d58ab\") " pod="openshift-monitoring/node-exporter-npfqq" Apr 21 01:52:17.033500 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:17.033506 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d46793fb-d7db-4f84-895f-8b7c420d58ab-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-npfqq\" (UID: \"d46793fb-d7db-4f84-895f-8b7c420d58ab\") " pod="openshift-monitoring/node-exporter-npfqq" Apr 21 
01:52:17.033761 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:17.033524 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d46793fb-d7db-4f84-895f-8b7c420d58ab-sys\") pod \"node-exporter-npfqq\" (UID: \"d46793fb-d7db-4f84-895f-8b7c420d58ab\") " pod="openshift-monitoring/node-exporter-npfqq"
Apr 21 01:52:17.033761 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:17.033547 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nhp2t\" (UniqueName: \"kubernetes.io/projected/d46793fb-d7db-4f84-895f-8b7c420d58ab-kube-api-access-nhp2t\") pod \"node-exporter-npfqq\" (UID: \"d46793fb-d7db-4f84-895f-8b7c420d58ab\") " pod="openshift-monitoring/node-exporter-npfqq"
Apr 21 01:52:17.033761 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:17.033587 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/d46793fb-d7db-4f84-895f-8b7c420d58ab-root\") pod \"node-exporter-npfqq\" (UID: \"d46793fb-d7db-4f84-895f-8b7c420d58ab\") " pod="openshift-monitoring/node-exporter-npfqq"
Apr 21 01:52:17.033761 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:17.033619 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/d46793fb-d7db-4f84-895f-8b7c420d58ab-node-exporter-tls\") pod \"node-exporter-npfqq\" (UID: \"d46793fb-d7db-4f84-895f-8b7c420d58ab\") " pod="openshift-monitoring/node-exporter-npfqq"
Apr 21 01:52:17.033761 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:17.033642 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d46793fb-d7db-4f84-895f-8b7c420d58ab-sys\") pod \"node-exporter-npfqq\" (UID: \"d46793fb-d7db-4f84-895f-8b7c420d58ab\") " pod="openshift-monitoring/node-exporter-npfqq"
Apr 21 01:52:17.033761 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:17.033644 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/d46793fb-d7db-4f84-895f-8b7c420d58ab-node-exporter-wtmp\") pod \"node-exporter-npfqq\" (UID: \"d46793fb-d7db-4f84-895f-8b7c420d58ab\") " pod="openshift-monitoring/node-exporter-npfqq"
Apr 21 01:52:17.033761 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:17.033683 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/d46793fb-d7db-4f84-895f-8b7c420d58ab-node-exporter-accelerators-collector-config\") pod \"node-exporter-npfqq\" (UID: \"d46793fb-d7db-4f84-895f-8b7c420d58ab\") " pod="openshift-monitoring/node-exporter-npfqq"
Apr 21 01:52:17.033761 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:17.033728 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/d46793fb-d7db-4f84-895f-8b7c420d58ab-node-exporter-textfile\") pod \"node-exporter-npfqq\" (UID: \"d46793fb-d7db-4f84-895f-8b7c420d58ab\") " pod="openshift-monitoring/node-exporter-npfqq"
Apr 21 01:52:17.033761 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:17.033756 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d46793fb-d7db-4f84-895f-8b7c420d58ab-metrics-client-ca\") pod \"node-exporter-npfqq\" (UID: \"d46793fb-d7db-4f84-895f-8b7c420d58ab\") " pod="openshift-monitoring/node-exporter-npfqq"
Apr 21 01:52:17.034248 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:17.034217 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/d46793fb-d7db-4f84-895f-8b7c420d58ab-node-exporter-textfile\") pod \"node-exporter-npfqq\" (UID: \"d46793fb-d7db-4f84-895f-8b7c420d58ab\") " pod="openshift-monitoring/node-exporter-npfqq"
Apr 21 01:52:17.034411 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:17.034376 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/d46793fb-d7db-4f84-895f-8b7c420d58ab-node-exporter-accelerators-collector-config\") pod \"node-exporter-npfqq\" (UID: \"d46793fb-d7db-4f84-895f-8b7c420d58ab\") " pod="openshift-monitoring/node-exporter-npfqq"
Apr 21 01:52:17.034504 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:17.034434 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d46793fb-d7db-4f84-895f-8b7c420d58ab-metrics-client-ca\") pod \"node-exporter-npfqq\" (UID: \"d46793fb-d7db-4f84-895f-8b7c420d58ab\") " pod="openshift-monitoring/node-exporter-npfqq"
Apr 21 01:52:17.034594 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:17.034570 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/d46793fb-d7db-4f84-895f-8b7c420d58ab-node-exporter-wtmp\") pod \"node-exporter-npfqq\" (UID: \"d46793fb-d7db-4f84-895f-8b7c420d58ab\") " pod="openshift-monitoring/node-exporter-npfqq"
Apr 21 01:52:17.036133 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:17.036105 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/d46793fb-d7db-4f84-895f-8b7c420d58ab-node-exporter-tls\") pod \"node-exporter-npfqq\" (UID: \"d46793fb-d7db-4f84-895f-8b7c420d58ab\") " pod="openshift-monitoring/node-exporter-npfqq"
Apr 21 01:52:17.036339 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:17.036317 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d46793fb-d7db-4f84-895f-8b7c420d58ab-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-npfqq\" (UID: \"d46793fb-d7db-4f84-895f-8b7c420d58ab\") " pod="openshift-monitoring/node-exporter-npfqq"
Apr 21 01:52:17.043040 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:17.043015 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhp2t\" (UniqueName: \"kubernetes.io/projected/d46793fb-d7db-4f84-895f-8b7c420d58ab-kube-api-access-nhp2t\") pod \"node-exporter-npfqq\" (UID: \"d46793fb-d7db-4f84-895f-8b7c420d58ab\") " pod="openshift-monitoring/node-exporter-npfqq"
Apr 21 01:52:17.059236 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:17.059190 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-npfqq"
Apr 21 01:52:17.069609 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:52:17.069577 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd46793fb_d7db_4f84_895f_8b7c420d58ab.slice/crio-00c510003cb6eddea81765754b396b9cc36078bf2fd0cf6d0c2846071377f492 WatchSource:0}: Error finding container 00c510003cb6eddea81765754b396b9cc36078bf2fd0cf6d0c2846071377f492: Status 404 returned error can't find the container with id 00c510003cb6eddea81765754b396b9cc36078bf2fd0cf6d0c2846071377f492
Apr 21 01:52:17.437803 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:17.437764 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/48803557-e6ce-4aa0-8b91-f591cb28551b-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-gptvn\" (UID: \"48803557-e6ce-4aa0-8b91-f591cb28551b\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gptvn"
Apr 21 01:52:17.438253 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:17.437898 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/a77bc043-0862-487d-a125-8ad685a25aed-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-bklrp\" (UID: \"a77bc043-0862-487d-a125-8ad685a25aed\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-bklrp"
Apr 21 01:52:17.440908 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:17.440884 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/48803557-e6ce-4aa0-8b91-f591cb28551b-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-gptvn\" (UID: \"48803557-e6ce-4aa0-8b91-f591cb28551b\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gptvn"
Apr 21 01:52:17.441043 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:17.440884 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/a77bc043-0862-487d-a125-8ad685a25aed-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-bklrp\" (UID: \"a77bc043-0862-487d-a125-8ad685a25aed\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-bklrp"
Apr 21 01:52:17.544955 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:17.544923 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-npfqq" event={"ID":"d46793fb-d7db-4f84-895f-8b7c420d58ab","Type":"ContainerStarted","Data":"00c510003cb6eddea81765754b396b9cc36078bf2fd0cf6d0c2846071377f492"}
Apr 21 01:52:17.631598 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:17.631568 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gptvn"
Apr 21 01:52:17.644206 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:17.644174 2573 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-bklrp"
Apr 21 01:52:17.825357 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:17.825318 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 21 01:52:17.833729 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:17.832461 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 21 01:52:17.835803 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:17.834915 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\""
Apr 21 01:52:17.835803 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:17.835030 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\""
Apr 21 01:52:17.835803 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:17.835212 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-dgztd\""
Apr 21 01:52:17.835803 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:17.835359 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\""
Apr 21 01:52:17.835803 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:17.835458 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\""
Apr 21 01:52:17.835803 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:17.835486 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\""
Apr 21 01:52:17.835803 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:17.835567 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\""
Apr 21 01:52:17.835803 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:17.835715 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\""
Apr 21 01:52:17.836203 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:17.835884 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\""
Apr 21 01:52:17.836203 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:17.836017 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\""
Apr 21 01:52:17.847069 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:17.847034 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 21 01:52:17.848385 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:17.848351 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-gptvn"]
Apr 21 01:52:17.872109 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:17.872083 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-bklrp"]
Apr 21 01:52:17.884107 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:52:17.884065 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda77bc043_0862_487d_a125_8ad685a25aed.slice/crio-0f40993685d211556a1e087cb70af16bbc07d7c3319ac3aa224029eb29396945 WatchSource:0}: Error finding container 0f40993685d211556a1e087cb70af16bbc07d7c3319ac3aa224029eb29396945: Status 404 returned error can't find the container with id 0f40993685d211556a1e087cb70af16bbc07d7c3319ac3aa224029eb29396945
Apr 21 01:52:17.945326 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:17.944558 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d162abce-05f8-4e2b-9911-404d5bc08b63-web-config\") pod \"alertmanager-main-0\" (UID: \"d162abce-05f8-4e2b-9911-404d5bc08b63\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 01:52:17.945326 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:17.944603 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d162abce-05f8-4e2b-9911-404d5bc08b63-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"d162abce-05f8-4e2b-9911-404d5bc08b63\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 01:52:17.945326 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:17.944674 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/d162abce-05f8-4e2b-9911-404d5bc08b63-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"d162abce-05f8-4e2b-9911-404d5bc08b63\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 01:52:17.945326 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:17.944719 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d162abce-05f8-4e2b-9911-404d5bc08b63-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"d162abce-05f8-4e2b-9911-404d5bc08b63\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 01:52:17.945326 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:17.944781 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d162abce-05f8-4e2b-9911-404d5bc08b63-config-out\") pod \"alertmanager-main-0\" (UID: \"d162abce-05f8-4e2b-9911-404d5bc08b63\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 01:52:17.945326 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:17.944809 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h64w2\" (UniqueName: \"kubernetes.io/projected/d162abce-05f8-4e2b-9911-404d5bc08b63-kube-api-access-h64w2\") pod \"alertmanager-main-0\" (UID: \"d162abce-05f8-4e2b-9911-404d5bc08b63\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 01:52:17.945326 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:17.944865 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d162abce-05f8-4e2b-9911-404d5bc08b63-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"d162abce-05f8-4e2b-9911-404d5bc08b63\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 01:52:17.945326 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:17.944918 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/d162abce-05f8-4e2b-9911-404d5bc08b63-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"d162abce-05f8-4e2b-9911-404d5bc08b63\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 01:52:17.945326 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:17.944945 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/d162abce-05f8-4e2b-9911-404d5bc08b63-config-volume\") pod \"alertmanager-main-0\" (UID: \"d162abce-05f8-4e2b-9911-404d5bc08b63\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 01:52:17.945326 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:17.944985 2573 reconciler_common.go:251]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/d162abce-05f8-4e2b-9911-404d5bc08b63-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"d162abce-05f8-4e2b-9911-404d5bc08b63\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 01:52:17.945326 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:17.945030 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d162abce-05f8-4e2b-9911-404d5bc08b63-tls-assets\") pod \"alertmanager-main-0\" (UID: \"d162abce-05f8-4e2b-9911-404d5bc08b63\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 01:52:17.945326 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:17.945081 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/d162abce-05f8-4e2b-9911-404d5bc08b63-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"d162abce-05f8-4e2b-9911-404d5bc08b63\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 01:52:17.945326 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:17.945120 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d162abce-05f8-4e2b-9911-404d5bc08b63-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"d162abce-05f8-4e2b-9911-404d5bc08b63\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 01:52:18.046545 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:18.046510 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/d162abce-05f8-4e2b-9911-404d5bc08b63-config-volume\") pod \"alertmanager-main-0\" (UID: \"d162abce-05f8-4e2b-9911-404d5bc08b63\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 01:52:18.046731 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:18.046570 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/d162abce-05f8-4e2b-9911-404d5bc08b63-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"d162abce-05f8-4e2b-9911-404d5bc08b63\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 01:52:18.046731 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:18.046618 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d162abce-05f8-4e2b-9911-404d5bc08b63-tls-assets\") pod \"alertmanager-main-0\" (UID: \"d162abce-05f8-4e2b-9911-404d5bc08b63\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 01:52:18.046731 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:18.046662 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/d162abce-05f8-4e2b-9911-404d5bc08b63-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"d162abce-05f8-4e2b-9911-404d5bc08b63\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 01:52:18.046906 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:18.046783 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d162abce-05f8-4e2b-9911-404d5bc08b63-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"d162abce-05f8-4e2b-9911-404d5bc08b63\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 01:52:18.046906 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:18.046846 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d162abce-05f8-4e2b-9911-404d5bc08b63-web-config\") pod \"alertmanager-main-0\" (UID: \"d162abce-05f8-4e2b-9911-404d5bc08b63\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 01:52:18.046906 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:18.046873 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d162abce-05f8-4e2b-9911-404d5bc08b63-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"d162abce-05f8-4e2b-9911-404d5bc08b63\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 01:52:18.046906 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:18.046903 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/d162abce-05f8-4e2b-9911-404d5bc08b63-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"d162abce-05f8-4e2b-9911-404d5bc08b63\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 01:52:18.047095 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:18.046942 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d162abce-05f8-4e2b-9911-404d5bc08b63-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"d162abce-05f8-4e2b-9911-404d5bc08b63\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 01:52:18.047095 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:18.046995 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d162abce-05f8-4e2b-9911-404d5bc08b63-config-out\") pod \"alertmanager-main-0\" (UID: \"d162abce-05f8-4e2b-9911-404d5bc08b63\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 01:52:18.047095 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:18.047023 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h64w2\" (UniqueName: \"kubernetes.io/projected/d162abce-05f8-4e2b-9911-404d5bc08b63-kube-api-access-h64w2\") pod \"alertmanager-main-0\" (UID: \"d162abce-05f8-4e2b-9911-404d5bc08b63\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 01:52:18.047095 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:18.047057 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d162abce-05f8-4e2b-9911-404d5bc08b63-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"d162abce-05f8-4e2b-9911-404d5bc08b63\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 01:52:18.047286 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:18.047107 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/d162abce-05f8-4e2b-9911-404d5bc08b63-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"d162abce-05f8-4e2b-9911-404d5bc08b63\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 01:52:18.047286 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:52:18.047217 2573 secret.go:189] Couldn't get secret openshift-monitoring/alertmanager-main-tls: secret "alertmanager-main-tls" not found
Apr 21 01:52:18.047286 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:52:18.047279 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d162abce-05f8-4e2b-9911-404d5bc08b63-secret-alertmanager-main-tls podName:d162abce-05f8-4e2b-9911-404d5bc08b63 nodeName:}" failed. No retries permitted until 2026-04-21 01:52:18.547256883 +0000 UTC m=+136.126558925 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "secret-alertmanager-main-tls" (UniqueName: "kubernetes.io/secret/d162abce-05f8-4e2b-9911-404d5bc08b63-secret-alertmanager-main-tls") pod "alertmanager-main-0" (UID: "d162abce-05f8-4e2b-9911-404d5bc08b63") : secret "alertmanager-main-tls" not found
Apr 21 01:52:18.048366 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:18.048336 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/d162abce-05f8-4e2b-9911-404d5bc08b63-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"d162abce-05f8-4e2b-9911-404d5bc08b63\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 01:52:18.048528 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:18.048508 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d162abce-05f8-4e2b-9911-404d5bc08b63-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"d162abce-05f8-4e2b-9911-404d5bc08b63\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 01:52:18.049391 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:18.049329 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d162abce-05f8-4e2b-9911-404d5bc08b63-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"d162abce-05f8-4e2b-9911-404d5bc08b63\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 01:52:18.050154 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:18.050128 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d162abce-05f8-4e2b-9911-404d5bc08b63-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"d162abce-05f8-4e2b-9911-404d5bc08b63\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 01:52:18.051627 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:18.051272 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d162abce-05f8-4e2b-9911-404d5bc08b63-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"d162abce-05f8-4e2b-9911-404d5bc08b63\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 01:52:18.051627 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:18.051574 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/d162abce-05f8-4e2b-9911-404d5bc08b63-config-volume\") pod \"alertmanager-main-0\" (UID: \"d162abce-05f8-4e2b-9911-404d5bc08b63\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 01:52:18.051777 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:18.051682 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/d162abce-05f8-4e2b-9911-404d5bc08b63-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"d162abce-05f8-4e2b-9911-404d5bc08b63\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 01:52:18.052045 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:18.052022 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/d162abce-05f8-4e2b-9911-404d5bc08b63-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"d162abce-05f8-4e2b-9911-404d5bc08b63\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 01:52:18.052151 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:18.052096 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d162abce-05f8-4e2b-9911-404d5bc08b63-tls-assets\") pod \"alertmanager-main-0\" (UID: \"d162abce-05f8-4e2b-9911-404d5bc08b63\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 01:52:18.052951 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:18.052929 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d162abce-05f8-4e2b-9911-404d5bc08b63-web-config\") pod \"alertmanager-main-0\" (UID: \"d162abce-05f8-4e2b-9911-404d5bc08b63\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 01:52:18.053223 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:18.053198 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d162abce-05f8-4e2b-9911-404d5bc08b63-config-out\") pod \"alertmanager-main-0\" (UID: \"d162abce-05f8-4e2b-9911-404d5bc08b63\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 01:52:18.055454 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:18.055435 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h64w2\" (UniqueName: \"kubernetes.io/projected/d162abce-05f8-4e2b-9911-404d5bc08b63-kube-api-access-h64w2\") pod \"alertmanager-main-0\" (UID: \"d162abce-05f8-4e2b-9911-404d5bc08b63\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 01:52:18.553888 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:18.553849 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/d162abce-05f8-4e2b-9911-404d5bc08b63-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"d162abce-05f8-4e2b-9911-404d5bc08b63\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 01:52:18.555181 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:18.555151 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-bklrp" event={"ID":"a77bc043-0862-487d-a125-8ad685a25aed","Type":"ContainerStarted","Data":"0f40993685d211556a1e087cb70af16bbc07d7c3319ac3aa224029eb29396945"}
Apr 21 01:52:18.557090 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:18.556846 2573 generic.go:358] "Generic (PLEG): container finished" podID="d46793fb-d7db-4f84-895f-8b7c420d58ab" containerID="64f0b40b6a1422dd33e9af6f975525f8c92c80172cea6ac288bac4a710d50b94" exitCode=0
Apr 21 01:52:18.557090 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:18.556928 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-npfqq" event={"ID":"d46793fb-d7db-4f84-895f-8b7c420d58ab","Type":"ContainerDied","Data":"64f0b40b6a1422dd33e9af6f975525f8c92c80172cea6ac288bac4a710d50b94"}
Apr 21 01:52:18.557090 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:18.557037 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/d162abce-05f8-4e2b-9911-404d5bc08b63-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"d162abce-05f8-4e2b-9911-404d5bc08b63\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 01:52:18.559501 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:18.559475 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gptvn" event={"ID":"48803557-e6ce-4aa0-8b91-f591cb28551b","Type":"ContainerStarted","Data":"215ec65a37c6d048f81e2ebc7f4c1b38ae1eb5a4830c43e6591e06d8d4a60f61"}
Apr 21 01:52:18.559608 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:18.559506 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gptvn" event={"ID":"48803557-e6ce-4aa0-8b91-f591cb28551b","Type":"ContainerStarted","Data":"0c8da6ef8589f4a508f424a3eeb8ad7103694abc7269a1de0fb35785aa67ef20"}
Apr 21 01:52:18.559608 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:18.559520 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gptvn"
event={"ID":"48803557-e6ce-4aa0-8b91-f591cb28551b","Type":"ContainerStarted","Data":"70744c79460b1ea7404a6785256c94c0e531ece5e4631b1d6d6c2a504263ec1b"}
Apr 21 01:52:18.748782 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:18.748744 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 21 01:52:19.474729 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:19.474683 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 21 01:52:19.478328 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:52:19.478292 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd162abce_05f8_4e2b_9911_404d5bc08b63.slice/crio-d87f528fa3f04f3bb8a22e464108345e200f226d958d01f03dc63b132b0cd699 WatchSource:0}: Error finding container d87f528fa3f04f3bb8a22e464108345e200f226d958d01f03dc63b132b0cd699: Status 404 returned error can't find the container with id d87f528fa3f04f3bb8a22e464108345e200f226d958d01f03dc63b132b0cd699
Apr 21 01:52:19.566169 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:19.566137 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-bklrp" event={"ID":"a77bc043-0862-487d-a125-8ad685a25aed","Type":"ContainerStarted","Data":"9bf7f7127cad01b5b427a3e106f550b07bbfb5b0912e9d7c69fbdea28e166c34"}
Apr 21 01:52:19.566516 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:19.566177 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-bklrp" event={"ID":"a77bc043-0862-487d-a125-8ad685a25aed","Type":"ContainerStarted","Data":"d44300e26672e0da6cb096d9f15b2dadf90e5807efc41c6ba702c2a81639d255"}
Apr 21 01:52:19.567344 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:19.567321 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d162abce-05f8-4e2b-9911-404d5bc08b63","Type":"ContainerStarted","Data":"d87f528fa3f04f3bb8a22e464108345e200f226d958d01f03dc63b132b0cd699"}
Apr 21 01:52:19.568970 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:19.568947 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-npfqq" event={"ID":"d46793fb-d7db-4f84-895f-8b7c420d58ab","Type":"ContainerStarted","Data":"85c5a36c5e143815365218c4419184e1953abe9fd380b930279d688f19fd319c"}
Apr 21 01:52:19.569072 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:19.568974 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-npfqq" event={"ID":"d46793fb-d7db-4f84-895f-8b7c420d58ab","Type":"ContainerStarted","Data":"a8f522bda45c37f17f099a4f625acefca82c1fd07dff026f8815a2f2e6a9190f"}
Apr 21 01:52:19.571086 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:19.571061 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gptvn" event={"ID":"48803557-e6ce-4aa0-8b91-f591cb28551b","Type":"ContainerStarted","Data":"31c3fa4323818bb2dcbe17aa988936bdbf6f12b9a25b5525c4aa2e2e0ded806d"}
Apr 21 01:52:19.589946 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:19.588428 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-npfqq" podStartSLOduration=2.917987248 podStartE2EDuration="3.588410534s" podCreationTimestamp="2026-04-21 01:52:16 +0000 UTC" firstStartedPulling="2026-04-21 01:52:17.071739982 +0000 UTC m=+134.651042018" lastFinishedPulling="2026-04-21 01:52:17.742163268 +0000 UTC m=+135.321465304" observedRunningTime="2026-04-21 01:52:19.587907936 +0000 UTC m=+137.167209995" watchObservedRunningTime="2026-04-21 01:52:19.588410534 +0000 UTC m=+137.167712594"
Apr 21 01:52:19.605844 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:19.605769 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gptvn" podStartSLOduration=2.295236804 podStartE2EDuration="3.605750724s" podCreationTimestamp="2026-04-21 01:52:16 +0000 UTC" firstStartedPulling="2026-04-21 01:52:18.020049923 +0000 UTC m=+135.599351959" lastFinishedPulling="2026-04-21 01:52:19.330563831 +0000 UTC m=+136.909865879" observedRunningTime="2026-04-21 01:52:19.604515202 +0000 UTC m=+137.183817262" watchObservedRunningTime="2026-04-21 01:52:19.605750724 +0000 UTC m=+137.185052812"
Apr 21 01:52:19.820841 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:19.820795 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-79669bdb7f-dcqbv"]
Apr 21 01:52:19.824995 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:19.824972 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-79669bdb7f-dcqbv"
Apr 21 01:52:19.827387 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:19.827355 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\""
Apr 21 01:52:19.827732 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:19.827506 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\""
Apr 21 01:52:19.827732 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:19.827611 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-jff7c\""
Apr 21 01:52:19.827732 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:19.827657 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\""
Apr 21 01:52:19.828025 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:19.828007 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\""
Apr 21 01:52:19.828078 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:19.828048 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-4ro4d0ndqhrbg\""
Apr 21 01:52:19.828246 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:19.828026 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\""
Apr 21 01:52:19.835374 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:19.835343 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-79669bdb7f-dcqbv"]
Apr 21 01:52:19.967023 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:19.966991 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/f583a61c-5c94-4c0a-bc35-38bbbde0ca46-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-79669bdb7f-dcqbv\" (UID: \"f583a61c-5c94-4c0a-bc35-38bbbde0ca46\") " pod="openshift-monitoring/thanos-querier-79669bdb7f-dcqbv"
Apr 21 01:52:19.967205 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:19.967039 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/f583a61c-5c94-4c0a-bc35-38bbbde0ca46-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-79669bdb7f-dcqbv\" (UID: \"f583a61c-5c94-4c0a-bc35-38bbbde0ca46\") " pod="openshift-monitoring/thanos-querier-79669bdb7f-dcqbv"
Apr 21 01:52:19.967205 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:19.967131 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName:
\"kubernetes.io/secret/f583a61c-5c94-4c0a-bc35-38bbbde0ca46-secret-grpc-tls\") pod \"thanos-querier-79669bdb7f-dcqbv\" (UID: \"f583a61c-5c94-4c0a-bc35-38bbbde0ca46\") " pod="openshift-monitoring/thanos-querier-79669bdb7f-dcqbv" Apr 21 01:52:19.967205 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:19.967163 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f583a61c-5c94-4c0a-bc35-38bbbde0ca46-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-79669bdb7f-dcqbv\" (UID: \"f583a61c-5c94-4c0a-bc35-38bbbde0ca46\") " pod="openshift-monitoring/thanos-querier-79669bdb7f-dcqbv" Apr 21 01:52:19.967312 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:19.967222 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f583a61c-5c94-4c0a-bc35-38bbbde0ca46-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-79669bdb7f-dcqbv\" (UID: \"f583a61c-5c94-4c0a-bc35-38bbbde0ca46\") " pod="openshift-monitoring/thanos-querier-79669bdb7f-dcqbv" Apr 21 01:52:19.967312 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:19.967251 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/f583a61c-5c94-4c0a-bc35-38bbbde0ca46-secret-thanos-querier-tls\") pod \"thanos-querier-79669bdb7f-dcqbv\" (UID: \"f583a61c-5c94-4c0a-bc35-38bbbde0ca46\") " pod="openshift-monitoring/thanos-querier-79669bdb7f-dcqbv" Apr 21 01:52:19.967312 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:19.967269 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f583a61c-5c94-4c0a-bc35-38bbbde0ca46-metrics-client-ca\") pod 
\"thanos-querier-79669bdb7f-dcqbv\" (UID: \"f583a61c-5c94-4c0a-bc35-38bbbde0ca46\") " pod="openshift-monitoring/thanos-querier-79669bdb7f-dcqbv" Apr 21 01:52:19.967402 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:19.967325 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7lkrh\" (UniqueName: \"kubernetes.io/projected/f583a61c-5c94-4c0a-bc35-38bbbde0ca46-kube-api-access-7lkrh\") pod \"thanos-querier-79669bdb7f-dcqbv\" (UID: \"f583a61c-5c94-4c0a-bc35-38bbbde0ca46\") " pod="openshift-monitoring/thanos-querier-79669bdb7f-dcqbv" Apr 21 01:52:20.070531 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:20.070492 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/f583a61c-5c94-4c0a-bc35-38bbbde0ca46-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-79669bdb7f-dcqbv\" (UID: \"f583a61c-5c94-4c0a-bc35-38bbbde0ca46\") " pod="openshift-monitoring/thanos-querier-79669bdb7f-dcqbv" Apr 21 01:52:20.070723 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:20.070582 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/f583a61c-5c94-4c0a-bc35-38bbbde0ca46-secret-grpc-tls\") pod \"thanos-querier-79669bdb7f-dcqbv\" (UID: \"f583a61c-5c94-4c0a-bc35-38bbbde0ca46\") " pod="openshift-monitoring/thanos-querier-79669bdb7f-dcqbv" Apr 21 01:52:20.070723 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:20.070617 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f583a61c-5c94-4c0a-bc35-38bbbde0ca46-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-79669bdb7f-dcqbv\" (UID: \"f583a61c-5c94-4c0a-bc35-38bbbde0ca46\") " pod="openshift-monitoring/thanos-querier-79669bdb7f-dcqbv" Apr 21 
01:52:20.070883 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:20.070748 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f583a61c-5c94-4c0a-bc35-38bbbde0ca46-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-79669bdb7f-dcqbv\" (UID: \"f583a61c-5c94-4c0a-bc35-38bbbde0ca46\") " pod="openshift-monitoring/thanos-querier-79669bdb7f-dcqbv" Apr 21 01:52:20.070883 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:20.070798 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/f583a61c-5c94-4c0a-bc35-38bbbde0ca46-secret-thanos-querier-tls\") pod \"thanos-querier-79669bdb7f-dcqbv\" (UID: \"f583a61c-5c94-4c0a-bc35-38bbbde0ca46\") " pod="openshift-monitoring/thanos-querier-79669bdb7f-dcqbv" Apr 21 01:52:20.070883 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:20.070837 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f583a61c-5c94-4c0a-bc35-38bbbde0ca46-metrics-client-ca\") pod \"thanos-querier-79669bdb7f-dcqbv\" (UID: \"f583a61c-5c94-4c0a-bc35-38bbbde0ca46\") " pod="openshift-monitoring/thanos-querier-79669bdb7f-dcqbv" Apr 21 01:52:20.070883 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:20.070876 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7lkrh\" (UniqueName: \"kubernetes.io/projected/f583a61c-5c94-4c0a-bc35-38bbbde0ca46-kube-api-access-7lkrh\") pod \"thanos-querier-79669bdb7f-dcqbv\" (UID: \"f583a61c-5c94-4c0a-bc35-38bbbde0ca46\") " pod="openshift-monitoring/thanos-querier-79669bdb7f-dcqbv" Apr 21 01:52:20.071087 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:20.070913 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: 
\"kubernetes.io/secret/f583a61c-5c94-4c0a-bc35-38bbbde0ca46-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-79669bdb7f-dcqbv\" (UID: \"f583a61c-5c94-4c0a-bc35-38bbbde0ca46\") " pod="openshift-monitoring/thanos-querier-79669bdb7f-dcqbv" Apr 21 01:52:20.071837 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:20.071582 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f583a61c-5c94-4c0a-bc35-38bbbde0ca46-metrics-client-ca\") pod \"thanos-querier-79669bdb7f-dcqbv\" (UID: \"f583a61c-5c94-4c0a-bc35-38bbbde0ca46\") " pod="openshift-monitoring/thanos-querier-79669bdb7f-dcqbv" Apr 21 01:52:20.074580 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:20.074536 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f583a61c-5c94-4c0a-bc35-38bbbde0ca46-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-79669bdb7f-dcqbv\" (UID: \"f583a61c-5c94-4c0a-bc35-38bbbde0ca46\") " pod="openshift-monitoring/thanos-querier-79669bdb7f-dcqbv" Apr 21 01:52:20.074709 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:20.074683 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/f583a61c-5c94-4c0a-bc35-38bbbde0ca46-secret-thanos-querier-tls\") pod \"thanos-querier-79669bdb7f-dcqbv\" (UID: \"f583a61c-5c94-4c0a-bc35-38bbbde0ca46\") " pod="openshift-monitoring/thanos-querier-79669bdb7f-dcqbv" Apr 21 01:52:20.074759 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:20.074710 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/f583a61c-5c94-4c0a-bc35-38bbbde0ca46-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-79669bdb7f-dcqbv\" (UID: \"f583a61c-5c94-4c0a-bc35-38bbbde0ca46\") " 
pod="openshift-monitoring/thanos-querier-79669bdb7f-dcqbv" Apr 21 01:52:20.074809 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:20.074759 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/f583a61c-5c94-4c0a-bc35-38bbbde0ca46-secret-grpc-tls\") pod \"thanos-querier-79669bdb7f-dcqbv\" (UID: \"f583a61c-5c94-4c0a-bc35-38bbbde0ca46\") " pod="openshift-monitoring/thanos-querier-79669bdb7f-dcqbv" Apr 21 01:52:20.074872 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:20.074840 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/f583a61c-5c94-4c0a-bc35-38bbbde0ca46-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-79669bdb7f-dcqbv\" (UID: \"f583a61c-5c94-4c0a-bc35-38bbbde0ca46\") " pod="openshift-monitoring/thanos-querier-79669bdb7f-dcqbv" Apr 21 01:52:20.075127 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:20.075106 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f583a61c-5c94-4c0a-bc35-38bbbde0ca46-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-79669bdb7f-dcqbv\" (UID: \"f583a61c-5c94-4c0a-bc35-38bbbde0ca46\") " pod="openshift-monitoring/thanos-querier-79669bdb7f-dcqbv" Apr 21 01:52:20.083029 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:20.083006 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7lkrh\" (UniqueName: \"kubernetes.io/projected/f583a61c-5c94-4c0a-bc35-38bbbde0ca46-kube-api-access-7lkrh\") pod \"thanos-querier-79669bdb7f-dcqbv\" (UID: \"f583a61c-5c94-4c0a-bc35-38bbbde0ca46\") " pod="openshift-monitoring/thanos-querier-79669bdb7f-dcqbv" Apr 21 01:52:20.136802 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:20.136772 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-79669bdb7f-dcqbv" Apr 21 01:52:20.282042 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:20.281982 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-79669bdb7f-dcqbv"] Apr 21 01:52:20.290892 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:52:20.290861 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf583a61c_5c94_4c0a_bc35_38bbbde0ca46.slice/crio-aaee35d4c5c6ed2321ebe30e1d9781a9564420bb23637b1266566eda5e6d4cec WatchSource:0}: Error finding container aaee35d4c5c6ed2321ebe30e1d9781a9564420bb23637b1266566eda5e6d4cec: Status 404 returned error can't find the container with id aaee35d4c5c6ed2321ebe30e1d9781a9564420bb23637b1266566eda5e6d4cec Apr 21 01:52:20.575918 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:20.575835 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-bklrp" event={"ID":"a77bc043-0862-487d-a125-8ad685a25aed","Type":"ContainerStarted","Data":"3a991d834d828ccf271d2ffff261b7f1284e9a7f3181fa6164ce527a34487c9a"} Apr 21 01:52:20.576992 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:20.576962 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-79669bdb7f-dcqbv" event={"ID":"f583a61c-5c94-4c0a-bc35-38bbbde0ca46","Type":"ContainerStarted","Data":"aaee35d4c5c6ed2321ebe30e1d9781a9564420bb23637b1266566eda5e6d4cec"} Apr 21 01:52:20.596226 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:20.596184 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-bklrp" podStartSLOduration=3.149000467 podStartE2EDuration="4.596169476s" podCreationTimestamp="2026-04-21 01:52:16 +0000 UTC" firstStartedPulling="2026-04-21 01:52:17.886525115 +0000 UTC m=+135.465827158" lastFinishedPulling="2026-04-21 
01:52:19.333694117 +0000 UTC m=+136.912996167" observedRunningTime="2026-04-21 01:52:20.594093284 +0000 UTC m=+138.173395355" watchObservedRunningTime="2026-04-21 01:52:20.596169476 +0000 UTC m=+138.175471533" Apr 21 01:52:21.126007 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:21.125956 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-f9b78694b-4qqtk"] Apr 21 01:52:21.129686 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:21.129659 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-f9b78694b-4qqtk" Apr 21 01:52:21.132278 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:21.132254 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" Apr 21 01:52:21.133368 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:21.133342 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 21 01:52:21.133477 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:21.133406 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" Apr 21 01:52:21.133581 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:21.133559 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" Apr 21 01:52:21.133687 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:21.133640 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-sp5nz\"" Apr 21 01:52:21.133751 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:21.133724 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-f7813bb4d8k7k\"" Apr 21 01:52:21.139645 ip-10-0-129-52 kubenswrapper[2573]: I0421 
01:52:21.139623 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-f9b78694b-4qqtk"] Apr 21 01:52:21.179612 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:21.179564 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ss7sn\" (UniqueName: \"kubernetes.io/projected/c3078e3f-e828-4443-9fa2-f548a350e468-kube-api-access-ss7sn\") pod \"metrics-server-f9b78694b-4qqtk\" (UID: \"c3078e3f-e828-4443-9fa2-f548a350e468\") " pod="openshift-monitoring/metrics-server-f9b78694b-4qqtk" Apr 21 01:52:21.179838 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:21.179636 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/c3078e3f-e828-4443-9fa2-f548a350e468-secret-metrics-server-client-certs\") pod \"metrics-server-f9b78694b-4qqtk\" (UID: \"c3078e3f-e828-4443-9fa2-f548a350e468\") " pod="openshift-monitoring/metrics-server-f9b78694b-4qqtk" Apr 21 01:52:21.179838 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:21.179717 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c3078e3f-e828-4443-9fa2-f548a350e468-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-f9b78694b-4qqtk\" (UID: \"c3078e3f-e828-4443-9fa2-f548a350e468\") " pod="openshift-monitoring/metrics-server-f9b78694b-4qqtk" Apr 21 01:52:21.179838 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:21.179762 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/c3078e3f-e828-4443-9fa2-f548a350e468-secret-metrics-server-tls\") pod \"metrics-server-f9b78694b-4qqtk\" (UID: \"c3078e3f-e828-4443-9fa2-f548a350e468\") " 
pod="openshift-monitoring/metrics-server-f9b78694b-4qqtk" Apr 21 01:52:21.179838 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:21.179793 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/c3078e3f-e828-4443-9fa2-f548a350e468-audit-log\") pod \"metrics-server-f9b78694b-4qqtk\" (UID: \"c3078e3f-e828-4443-9fa2-f548a350e468\") " pod="openshift-monitoring/metrics-server-f9b78694b-4qqtk" Apr 21 01:52:21.179999 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:21.179909 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/c3078e3f-e828-4443-9fa2-f548a350e468-metrics-server-audit-profiles\") pod \"metrics-server-f9b78694b-4qqtk\" (UID: \"c3078e3f-e828-4443-9fa2-f548a350e468\") " pod="openshift-monitoring/metrics-server-f9b78694b-4qqtk" Apr 21 01:52:21.179999 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:21.179949 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3078e3f-e828-4443-9fa2-f548a350e468-client-ca-bundle\") pod \"metrics-server-f9b78694b-4qqtk\" (UID: \"c3078e3f-e828-4443-9fa2-f548a350e468\") " pod="openshift-monitoring/metrics-server-f9b78694b-4qqtk" Apr 21 01:52:21.280753 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:21.280705 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/c3078e3f-e828-4443-9fa2-f548a350e468-metrics-server-audit-profiles\") pod \"metrics-server-f9b78694b-4qqtk\" (UID: \"c3078e3f-e828-4443-9fa2-f548a350e468\") " pod="openshift-monitoring/metrics-server-f9b78694b-4qqtk" Apr 21 01:52:21.280947 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:21.280767 2573 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3078e3f-e828-4443-9fa2-f548a350e468-client-ca-bundle\") pod \"metrics-server-f9b78694b-4qqtk\" (UID: \"c3078e3f-e828-4443-9fa2-f548a350e468\") " pod="openshift-monitoring/metrics-server-f9b78694b-4qqtk" Apr 21 01:52:21.280947 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:21.280835 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ss7sn\" (UniqueName: \"kubernetes.io/projected/c3078e3f-e828-4443-9fa2-f548a350e468-kube-api-access-ss7sn\") pod \"metrics-server-f9b78694b-4qqtk\" (UID: \"c3078e3f-e828-4443-9fa2-f548a350e468\") " pod="openshift-monitoring/metrics-server-f9b78694b-4qqtk" Apr 21 01:52:21.280947 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:21.280899 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/c3078e3f-e828-4443-9fa2-f548a350e468-secret-metrics-server-client-certs\") pod \"metrics-server-f9b78694b-4qqtk\" (UID: \"c3078e3f-e828-4443-9fa2-f548a350e468\") " pod="openshift-monitoring/metrics-server-f9b78694b-4qqtk" Apr 21 01:52:21.280947 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:21.280943 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c3078e3f-e828-4443-9fa2-f548a350e468-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-f9b78694b-4qqtk\" (UID: \"c3078e3f-e828-4443-9fa2-f548a350e468\") " pod="openshift-monitoring/metrics-server-f9b78694b-4qqtk" Apr 21 01:52:21.281167 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:21.281062 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/c3078e3f-e828-4443-9fa2-f548a350e468-secret-metrics-server-tls\") pod \"metrics-server-f9b78694b-4qqtk\" (UID: 
\"c3078e3f-e828-4443-9fa2-f548a350e468\") " pod="openshift-monitoring/metrics-server-f9b78694b-4qqtk" Apr 21 01:52:21.281167 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:21.281109 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/c3078e3f-e828-4443-9fa2-f548a350e468-audit-log\") pod \"metrics-server-f9b78694b-4qqtk\" (UID: \"c3078e3f-e828-4443-9fa2-f548a350e468\") " pod="openshift-monitoring/metrics-server-f9b78694b-4qqtk" Apr 21 01:52:21.281726 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:21.281609 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c3078e3f-e828-4443-9fa2-f548a350e468-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-f9b78694b-4qqtk\" (UID: \"c3078e3f-e828-4443-9fa2-f548a350e468\") " pod="openshift-monitoring/metrics-server-f9b78694b-4qqtk" Apr 21 01:52:21.281883 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:21.281849 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/c3078e3f-e828-4443-9fa2-f548a350e468-metrics-server-audit-profiles\") pod \"metrics-server-f9b78694b-4qqtk\" (UID: \"c3078e3f-e828-4443-9fa2-f548a350e468\") " pod="openshift-monitoring/metrics-server-f9b78694b-4qqtk" Apr 21 01:52:21.281966 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:21.281892 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/c3078e3f-e828-4443-9fa2-f548a350e468-audit-log\") pod \"metrics-server-f9b78694b-4qqtk\" (UID: \"c3078e3f-e828-4443-9fa2-f548a350e468\") " pod="openshift-monitoring/metrics-server-f9b78694b-4qqtk" Apr 21 01:52:21.283990 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:21.283963 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/c3078e3f-e828-4443-9fa2-f548a350e468-secret-metrics-server-tls\") pod \"metrics-server-f9b78694b-4qqtk\" (UID: \"c3078e3f-e828-4443-9fa2-f548a350e468\") " pod="openshift-monitoring/metrics-server-f9b78694b-4qqtk" Apr 21 01:52:21.284098 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:21.283997 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/c3078e3f-e828-4443-9fa2-f548a350e468-secret-metrics-server-client-certs\") pod \"metrics-server-f9b78694b-4qqtk\" (UID: \"c3078e3f-e828-4443-9fa2-f548a350e468\") " pod="openshift-monitoring/metrics-server-f9b78694b-4qqtk" Apr 21 01:52:21.284098 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:21.284013 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3078e3f-e828-4443-9fa2-f548a350e468-client-ca-bundle\") pod \"metrics-server-f9b78694b-4qqtk\" (UID: \"c3078e3f-e828-4443-9fa2-f548a350e468\") " pod="openshift-monitoring/metrics-server-f9b78694b-4qqtk" Apr 21 01:52:21.288764 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:21.288743 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ss7sn\" (UniqueName: \"kubernetes.io/projected/c3078e3f-e828-4443-9fa2-f548a350e468-kube-api-access-ss7sn\") pod \"metrics-server-f9b78694b-4qqtk\" (UID: \"c3078e3f-e828-4443-9fa2-f548a350e468\") " pod="openshift-monitoring/metrics-server-f9b78694b-4qqtk" Apr 21 01:52:21.443060 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:21.442965 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-f9b78694b-4qqtk" Apr 21 01:52:21.582196 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:21.582151 2573 generic.go:358] "Generic (PLEG): container finished" podID="d162abce-05f8-4e2b-9911-404d5bc08b63" containerID="616f854bcff9997aa7968cfa9d7583368bc7f9910f972054c325fff87efb3c7a" exitCode=0 Apr 21 01:52:21.582727 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:21.582696 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d162abce-05f8-4e2b-9911-404d5bc08b63","Type":"ContainerDied","Data":"616f854bcff9997aa7968cfa9d7583368bc7f9910f972054c325fff87efb3c7a"} Apr 21 01:52:21.591511 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:21.591488 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-f9b78694b-4qqtk"] Apr 21 01:52:21.918052 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:21.918022 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-868c6cc5cf-wqxfb"] Apr 21 01:52:21.948676 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:21.948637 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6c4fcddbdb-bbk5k"] Apr 21 01:52:21.953571 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:21.953545 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6c4fcddbdb-bbk5k" Apr 21 01:52:21.959462 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:21.959435 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6c4fcddbdb-bbk5k"] Apr 21 01:52:21.987842 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:21.987792 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9bsmg\" (UniqueName: \"kubernetes.io/projected/22261168-6623-452a-8dca-8ac684c5d859-kube-api-access-9bsmg\") pod \"console-6c4fcddbdb-bbk5k\" (UID: \"22261168-6623-452a-8dca-8ac684c5d859\") " pod="openshift-console/console-6c4fcddbdb-bbk5k" Apr 21 01:52:21.988031 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:21.987927 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/22261168-6623-452a-8dca-8ac684c5d859-trusted-ca-bundle\") pod \"console-6c4fcddbdb-bbk5k\" (UID: \"22261168-6623-452a-8dca-8ac684c5d859\") " pod="openshift-console/console-6c4fcddbdb-bbk5k" Apr 21 01:52:21.988031 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:21.987961 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/22261168-6623-452a-8dca-8ac684c5d859-console-config\") pod \"console-6c4fcddbdb-bbk5k\" (UID: \"22261168-6623-452a-8dca-8ac684c5d859\") " pod="openshift-console/console-6c4fcddbdb-bbk5k" Apr 21 01:52:21.988031 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:21.987981 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/22261168-6623-452a-8dca-8ac684c5d859-console-oauth-config\") pod \"console-6c4fcddbdb-bbk5k\" (UID: \"22261168-6623-452a-8dca-8ac684c5d859\") " pod="openshift-console/console-6c4fcddbdb-bbk5k" Apr 21 
01:52:21.988191 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:21.988039 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/22261168-6623-452a-8dca-8ac684c5d859-console-serving-cert\") pod \"console-6c4fcddbdb-bbk5k\" (UID: \"22261168-6623-452a-8dca-8ac684c5d859\") " pod="openshift-console/console-6c4fcddbdb-bbk5k" Apr 21 01:52:21.988191 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:21.988064 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/22261168-6623-452a-8dca-8ac684c5d859-oauth-serving-cert\") pod \"console-6c4fcddbdb-bbk5k\" (UID: \"22261168-6623-452a-8dca-8ac684c5d859\") " pod="openshift-console/console-6c4fcddbdb-bbk5k" Apr 21 01:52:21.988191 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:21.988102 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/22261168-6623-452a-8dca-8ac684c5d859-service-ca\") pod \"console-6c4fcddbdb-bbk5k\" (UID: \"22261168-6623-452a-8dca-8ac684c5d859\") " pod="openshift-console/console-6c4fcddbdb-bbk5k" Apr 21 01:52:22.088970 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:22.088935 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/22261168-6623-452a-8dca-8ac684c5d859-console-config\") pod \"console-6c4fcddbdb-bbk5k\" (UID: \"22261168-6623-452a-8dca-8ac684c5d859\") " pod="openshift-console/console-6c4fcddbdb-bbk5k" Apr 21 01:52:22.089126 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:22.088982 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/22261168-6623-452a-8dca-8ac684c5d859-console-oauth-config\") pod 
\"console-6c4fcddbdb-bbk5k\" (UID: \"22261168-6623-452a-8dca-8ac684c5d859\") " pod="openshift-console/console-6c4fcddbdb-bbk5k" Apr 21 01:52:22.089126 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:22.089085 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/22261168-6623-452a-8dca-8ac684c5d859-console-serving-cert\") pod \"console-6c4fcddbdb-bbk5k\" (UID: \"22261168-6623-452a-8dca-8ac684c5d859\") " pod="openshift-console/console-6c4fcddbdb-bbk5k" Apr 21 01:52:22.089126 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:22.089115 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/22261168-6623-452a-8dca-8ac684c5d859-oauth-serving-cert\") pod \"console-6c4fcddbdb-bbk5k\" (UID: \"22261168-6623-452a-8dca-8ac684c5d859\") " pod="openshift-console/console-6c4fcddbdb-bbk5k" Apr 21 01:52:22.089266 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:22.089142 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/22261168-6623-452a-8dca-8ac684c5d859-service-ca\") pod \"console-6c4fcddbdb-bbk5k\" (UID: \"22261168-6623-452a-8dca-8ac684c5d859\") " pod="openshift-console/console-6c4fcddbdb-bbk5k" Apr 21 01:52:22.089266 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:22.089247 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9bsmg\" (UniqueName: \"kubernetes.io/projected/22261168-6623-452a-8dca-8ac684c5d859-kube-api-access-9bsmg\") pod \"console-6c4fcddbdb-bbk5k\" (UID: \"22261168-6623-452a-8dca-8ac684c5d859\") " pod="openshift-console/console-6c4fcddbdb-bbk5k" Apr 21 01:52:22.089365 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:22.089315 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/22261168-6623-452a-8dca-8ac684c5d859-trusted-ca-bundle\") pod \"console-6c4fcddbdb-bbk5k\" (UID: \"22261168-6623-452a-8dca-8ac684c5d859\") " pod="openshift-console/console-6c4fcddbdb-bbk5k" Apr 21 01:52:22.089759 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:22.089682 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/22261168-6623-452a-8dca-8ac684c5d859-console-config\") pod \"console-6c4fcddbdb-bbk5k\" (UID: \"22261168-6623-452a-8dca-8ac684c5d859\") " pod="openshift-console/console-6c4fcddbdb-bbk5k" Apr 21 01:52:22.089904 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:22.089788 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/22261168-6623-452a-8dca-8ac684c5d859-oauth-serving-cert\") pod \"console-6c4fcddbdb-bbk5k\" (UID: \"22261168-6623-452a-8dca-8ac684c5d859\") " pod="openshift-console/console-6c4fcddbdb-bbk5k" Apr 21 01:52:22.090294 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:22.090272 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/22261168-6623-452a-8dca-8ac684c5d859-service-ca\") pod \"console-6c4fcddbdb-bbk5k\" (UID: \"22261168-6623-452a-8dca-8ac684c5d859\") " pod="openshift-console/console-6c4fcddbdb-bbk5k" Apr 21 01:52:22.090446 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:22.090424 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/22261168-6623-452a-8dca-8ac684c5d859-trusted-ca-bundle\") pod \"console-6c4fcddbdb-bbk5k\" (UID: \"22261168-6623-452a-8dca-8ac684c5d859\") " pod="openshift-console/console-6c4fcddbdb-bbk5k" Apr 21 01:52:22.091640 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:22.091619 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" 
(UniqueName: \"kubernetes.io/secret/22261168-6623-452a-8dca-8ac684c5d859-console-oauth-config\") pod \"console-6c4fcddbdb-bbk5k\" (UID: \"22261168-6623-452a-8dca-8ac684c5d859\") " pod="openshift-console/console-6c4fcddbdb-bbk5k" Apr 21 01:52:22.091895 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:22.091878 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/22261168-6623-452a-8dca-8ac684c5d859-console-serving-cert\") pod \"console-6c4fcddbdb-bbk5k\" (UID: \"22261168-6623-452a-8dca-8ac684c5d859\") " pod="openshift-console/console-6c4fcddbdb-bbk5k" Apr 21 01:52:22.096664 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:22.096644 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9bsmg\" (UniqueName: \"kubernetes.io/projected/22261168-6623-452a-8dca-8ac684c5d859-kube-api-access-9bsmg\") pod \"console-6c4fcddbdb-bbk5k\" (UID: \"22261168-6623-452a-8dca-8ac684c5d859\") " pod="openshift-console/console-6c4fcddbdb-bbk5k" Apr 21 01:52:22.102617 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:52:22.102588 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc3078e3f_e828_4443_9fa2_f548a350e468.slice/crio-1dd2ca2ef0801f882c94fde6a56190e3056a195b1a14d63208edec4c91129d44 WatchSource:0}: Error finding container 1dd2ca2ef0801f882c94fde6a56190e3056a195b1a14d63208edec4c91129d44: Status 404 returned error can't find the container with id 1dd2ca2ef0801f882c94fde6a56190e3056a195b1a14d63208edec4c91129d44 Apr 21 01:52:22.267660 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:22.267635 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6c4fcddbdb-bbk5k" Apr 21 01:52:22.412544 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:22.412510 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6c4fcddbdb-bbk5k"] Apr 21 01:52:22.414687 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:52:22.414657 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod22261168_6623_452a_8dca_8ac684c5d859.slice/crio-b1795b95fae6fc0ac4465182a70a7aa464fb3b9fd431c673186b78d6005562b0 WatchSource:0}: Error finding container b1795b95fae6fc0ac4465182a70a7aa464fb3b9fd431c673186b78d6005562b0: Status 404 returned error can't find the container with id b1795b95fae6fc0ac4465182a70a7aa464fb3b9fd431c673186b78d6005562b0 Apr 21 01:52:22.588883 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:22.588828 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6c4fcddbdb-bbk5k" event={"ID":"22261168-6623-452a-8dca-8ac684c5d859","Type":"ContainerStarted","Data":"b52d14bcde1d1d88c292efe8bdef2b58fa3a9c1fb9b26182399ca2062748869a"} Apr 21 01:52:22.588883 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:22.588889 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6c4fcddbdb-bbk5k" event={"ID":"22261168-6623-452a-8dca-8ac684c5d859","Type":"ContainerStarted","Data":"b1795b95fae6fc0ac4465182a70a7aa464fb3b9fd431c673186b78d6005562b0"} Apr 21 01:52:22.591143 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:22.591087 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-79669bdb7f-dcqbv" event={"ID":"f583a61c-5c94-4c0a-bc35-38bbbde0ca46","Type":"ContainerStarted","Data":"e688792684b63c58b8f2c7364ac619d9c410cd6931afd07e828704bf58eb0824"} Apr 21 01:52:22.591143 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:22.591124 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/thanos-querier-79669bdb7f-dcqbv" event={"ID":"f583a61c-5c94-4c0a-bc35-38bbbde0ca46","Type":"ContainerStarted","Data":"e049a8dd09b7e9b4553f544bdf3a4b3e78823f0c30f9669668b5fa5c63c2eb0f"} Apr 21 01:52:22.591143 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:22.591142 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-79669bdb7f-dcqbv" event={"ID":"f583a61c-5c94-4c0a-bc35-38bbbde0ca46","Type":"ContainerStarted","Data":"6adda76fc5d3289d6000360b554c9ec41b11e66760213b2cc6bce8ddd5511321"} Apr 21 01:52:22.594110 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:22.594037 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-f9b78694b-4qqtk" event={"ID":"c3078e3f-e828-4443-9fa2-f548a350e468","Type":"ContainerStarted","Data":"1dd2ca2ef0801f882c94fde6a56190e3056a195b1a14d63208edec4c91129d44"} Apr 21 01:52:22.606082 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:22.605040 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6c4fcddbdb-bbk5k" podStartSLOduration=1.6050209130000002 podStartE2EDuration="1.605020913s" podCreationTimestamp="2026-04-21 01:52:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 01:52:22.604387652 +0000 UTC m=+140.183689702" watchObservedRunningTime="2026-04-21 01:52:22.605020913 +0000 UTC m=+140.184322970" Apr 21 01:52:24.208349 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:24.208320 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-868c6cc5cf-wqxfb" Apr 21 01:52:24.605223 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:24.605138 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"d162abce-05f8-4e2b-9911-404d5bc08b63","Type":"ContainerStarted","Data":"9e1f8d621cacc346f9443613dfb06d74edfb999a570f01be1a78615defad1e5b"} Apr 21 01:52:24.605223 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:24.605180 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d162abce-05f8-4e2b-9911-404d5bc08b63","Type":"ContainerStarted","Data":"3b250cb1b56c96b25b3d090b87e0458eb2d070520f7efa9352c05d4db0ca94d8"} Apr 21 01:52:24.605223 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:24.605195 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d162abce-05f8-4e2b-9911-404d5bc08b63","Type":"ContainerStarted","Data":"19bd8cdcfa991d5a321f26d6946ecdb0841c72473319adf54b609ffefadb27f5"} Apr 21 01:52:24.605223 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:24.605207 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d162abce-05f8-4e2b-9911-404d5bc08b63","Type":"ContainerStarted","Data":"6c8491320c493e16276301b8cf261861b702669eecd2eb8f831ea8cc53de824a"} Apr 21 01:52:24.605223 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:24.605220 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d162abce-05f8-4e2b-9911-404d5bc08b63","Type":"ContainerStarted","Data":"05ded2eedf1ffb651e20a5734cdbd63c711111fb5dcb80f72d18ac8ef80a328f"} Apr 21 01:52:24.605572 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:24.605232 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d162abce-05f8-4e2b-9911-404d5bc08b63","Type":"ContainerStarted","Data":"9ed63446a5b6a28356b1bbcf87e145452ba27af9d2463b73a78bda865e9c47c0"} Apr 21 01:52:24.607885 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:24.607855 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/thanos-querier-79669bdb7f-dcqbv" event={"ID":"f583a61c-5c94-4c0a-bc35-38bbbde0ca46","Type":"ContainerStarted","Data":"e96e1a7651e574507ccde683d7684c225f0b65f3a9c93ae5c0c45c338235f79c"} Apr 21 01:52:24.608001 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:24.607891 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-79669bdb7f-dcqbv" event={"ID":"f583a61c-5c94-4c0a-bc35-38bbbde0ca46","Type":"ContainerStarted","Data":"2d215252dec4ce397379d43aa07ca5048070d300d69e1c2dfb41351b67c64e8b"} Apr 21 01:52:24.608001 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:24.607904 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-79669bdb7f-dcqbv" event={"ID":"f583a61c-5c94-4c0a-bc35-38bbbde0ca46","Type":"ContainerStarted","Data":"09011b8c6c3ef392a0cbfdc7ab51edc2ad3c37eef641865c0c29dd6b89deb70e"} Apr 21 01:52:24.608122 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:24.608017 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-79669bdb7f-dcqbv" Apr 21 01:52:24.609337 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:24.609314 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-f9b78694b-4qqtk" event={"ID":"c3078e3f-e828-4443-9fa2-f548a350e468","Type":"ContainerStarted","Data":"9988e6f9a2eecb20efb66290791bc9deae5a227970d55f441d55ab447fb16ac3"} Apr 21 01:52:24.633462 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:24.633411 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=3.161128236 podStartE2EDuration="7.633395717s" podCreationTimestamp="2026-04-21 01:52:17 +0000 UTC" firstStartedPulling="2026-04-21 01:52:19.480950021 +0000 UTC m=+137.060252069" lastFinishedPulling="2026-04-21 01:52:23.95321751 +0000 UTC m=+141.532519550" observedRunningTime="2026-04-21 01:52:24.632404089 
+0000 UTC m=+142.211706148" watchObservedRunningTime="2026-04-21 01:52:24.633395717 +0000 UTC m=+142.212697778" Apr 21 01:52:24.653236 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:24.653183 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-f9b78694b-4qqtk" podStartSLOduration=1.80289031 podStartE2EDuration="3.653166402s" podCreationTimestamp="2026-04-21 01:52:21 +0000 UTC" firstStartedPulling="2026-04-21 01:52:22.10496525 +0000 UTC m=+139.684267301" lastFinishedPulling="2026-04-21 01:52:23.955241344 +0000 UTC m=+141.534543393" observedRunningTime="2026-04-21 01:52:24.651965115 +0000 UTC m=+142.231267174" watchObservedRunningTime="2026-04-21 01:52:24.653166402 +0000 UTC m=+142.232468460" Apr 21 01:52:24.674027 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:24.673968 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-79669bdb7f-dcqbv" podStartSLOduration=1.973574175 podStartE2EDuration="5.673946615s" podCreationTimestamp="2026-04-21 01:52:19 +0000 UTC" firstStartedPulling="2026-04-21 01:52:20.292731469 +0000 UTC m=+137.872033505" lastFinishedPulling="2026-04-21 01:52:23.993103907 +0000 UTC m=+141.572405945" observedRunningTime="2026-04-21 01:52:24.672233033 +0000 UTC m=+142.251535092" watchObservedRunningTime="2026-04-21 01:52:24.673946615 +0000 UTC m=+142.253248674" Apr 21 01:52:30.618123 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:30.618095 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-79669bdb7f-dcqbv" Apr 21 01:52:32.268071 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:32.268023 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6c4fcddbdb-bbk5k" Apr 21 01:52:32.268071 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:32.268078 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-console/console-6c4fcddbdb-bbk5k" Apr 21 01:52:32.273283 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:32.273259 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6c4fcddbdb-bbk5k" Apr 21 01:52:32.636977 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:32.636950 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6c4fcddbdb-bbk5k" Apr 21 01:52:32.680556 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:32.680525 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6dbfbfc9c4-4gx44"] Apr 21 01:52:38.286738 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:52:38.286692 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-w2qdq" podUID="8598fef8-2cf7-4d82-aa02-44eac46217af" Apr 21 01:52:38.304065 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:52:38.304024 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-r9t8h" podUID="ff1d28c8-cbb1-4385-9c36-da62f691590f" Apr 21 01:52:38.648997 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:38.648968 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-w2qdq" Apr 21 01:52:41.443429 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:41.443375 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-f9b78694b-4qqtk" Apr 21 01:52:41.443429 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:41.443433 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-f9b78694b-4qqtk" Apr 21 01:52:43.177146 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:43.177092 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ff1d28c8-cbb1-4385-9c36-da62f691590f-cert\") pod \"ingress-canary-r9t8h\" (UID: \"ff1d28c8-cbb1-4385-9c36-da62f691590f\") " pod="openshift-ingress-canary/ingress-canary-r9t8h" Apr 21 01:52:43.177146 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:43.177152 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8598fef8-2cf7-4d82-aa02-44eac46217af-metrics-tls\") pod \"dns-default-w2qdq\" (UID: \"8598fef8-2cf7-4d82-aa02-44eac46217af\") " pod="openshift-dns/dns-default-w2qdq" Apr 21 01:52:43.179571 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:43.179539 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8598fef8-2cf7-4d82-aa02-44eac46217af-metrics-tls\") pod \"dns-default-w2qdq\" (UID: \"8598fef8-2cf7-4d82-aa02-44eac46217af\") " pod="openshift-dns/dns-default-w2qdq" Apr 21 01:52:43.179685 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:43.179619 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ff1d28c8-cbb1-4385-9c36-da62f691590f-cert\") pod \"ingress-canary-r9t8h\" (UID: \"ff1d28c8-cbb1-4385-9c36-da62f691590f\") " 
pod="openshift-ingress-canary/ingress-canary-r9t8h" Apr 21 01:52:43.451746 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:43.451665 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-vd9s5\"" Apr 21 01:52:43.459569 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:43.459551 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-w2qdq" Apr 21 01:52:43.579601 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:43.579573 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-w2qdq"] Apr 21 01:52:43.581931 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:52:43.581906 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8598fef8_2cf7_4d82_aa02_44eac46217af.slice/crio-3a3109c3c7656ad6141ed499751533ee29559e068477644fec93315e14ac3917 WatchSource:0}: Error finding container 3a3109c3c7656ad6141ed499751533ee29559e068477644fec93315e14ac3917: Status 404 returned error can't find the container with id 3a3109c3c7656ad6141ed499751533ee29559e068477644fec93315e14ac3917 Apr 21 01:52:43.663877 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:43.663838 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-w2qdq" event={"ID":"8598fef8-2cf7-4d82-aa02-44eac46217af","Type":"ContainerStarted","Data":"3a3109c3c7656ad6141ed499751533ee29559e068477644fec93315e14ac3917"} Apr 21 01:52:44.152629 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:44.152596 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-65f9585684-glq78_70601297-057d-42c8-bef0-315d6797ccfd/router/0.log" Apr 21 01:52:45.676550 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:45.676508 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-w2qdq" 
event={"ID":"8598fef8-2cf7-4d82-aa02-44eac46217af","Type":"ContainerStarted","Data":"ded7369a5c011e2ccc31a73082dc3b45b9d1207f8a9067cf2f6518e1679dd84e"} Apr 21 01:52:45.676550 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:45.676555 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-w2qdq" event={"ID":"8598fef8-2cf7-4d82-aa02-44eac46217af","Type":"ContainerStarted","Data":"01a6e7b481cd3e23dfcfe9a46e67b244a7633fc717705eca8772c608be3a3a54"} Apr 21 01:52:45.676997 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:45.676593 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-w2qdq" Apr 21 01:52:45.694314 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:45.694259 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-w2qdq" podStartSLOduration=129.327395602 podStartE2EDuration="2m10.69424344s" podCreationTimestamp="2026-04-21 01:50:35 +0000 UTC" firstStartedPulling="2026-04-21 01:52:43.584054785 +0000 UTC m=+161.163356821" lastFinishedPulling="2026-04-21 01:52:44.95090262 +0000 UTC m=+162.530204659" observedRunningTime="2026-04-21 01:52:45.693264539 +0000 UTC m=+163.272566598" watchObservedRunningTime="2026-04-21 01:52:45.69424344 +0000 UTC m=+163.273545498" Apr 21 01:52:46.944000 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:46.943941 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-868c6cc5cf-wqxfb" podUID="6b734ae0-7dff-4e82-b981-5c23af27b113" containerName="console" containerID="cri-o://3cd371062a3e430258d7d43adbc5b64a7595a00939e68dc68d45290ee332707a" gracePeriod=15 Apr 21 01:52:47.184956 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:47.184932 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-868c6cc5cf-wqxfb_6b734ae0-7dff-4e82-b981-5c23af27b113/console/0.log" Apr 21 01:52:47.185076 ip-10-0-129-52 kubenswrapper[2573]: I0421 
01:52:47.184996 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-868c6cc5cf-wqxfb" Apr 21 01:52:47.310742 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:47.310650 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6b734ae0-7dff-4e82-b981-5c23af27b113-oauth-serving-cert\") pod \"6b734ae0-7dff-4e82-b981-5c23af27b113\" (UID: \"6b734ae0-7dff-4e82-b981-5c23af27b113\") " Apr 21 01:52:47.310742 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:47.310729 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6b734ae0-7dff-4e82-b981-5c23af27b113-trusted-ca-bundle\") pod \"6b734ae0-7dff-4e82-b981-5c23af27b113\" (UID: \"6b734ae0-7dff-4e82-b981-5c23af27b113\") " Apr 21 01:52:47.310988 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:47.310755 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6b734ae0-7dff-4e82-b981-5c23af27b113-console-serving-cert\") pod \"6b734ae0-7dff-4e82-b981-5c23af27b113\" (UID: \"6b734ae0-7dff-4e82-b981-5c23af27b113\") " Apr 21 01:52:47.310988 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:47.310798 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6b734ae0-7dff-4e82-b981-5c23af27b113-console-oauth-config\") pod \"6b734ae0-7dff-4e82-b981-5c23af27b113\" (UID: \"6b734ae0-7dff-4e82-b981-5c23af27b113\") " Apr 21 01:52:47.310988 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:47.310852 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6b734ae0-7dff-4e82-b981-5c23af27b113-service-ca\") pod \"6b734ae0-7dff-4e82-b981-5c23af27b113\" (UID: 
\"6b734ae0-7dff-4e82-b981-5c23af27b113\") " Apr 21 01:52:47.310988 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:47.310880 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dtsnq\" (UniqueName: \"kubernetes.io/projected/6b734ae0-7dff-4e82-b981-5c23af27b113-kube-api-access-dtsnq\") pod \"6b734ae0-7dff-4e82-b981-5c23af27b113\" (UID: \"6b734ae0-7dff-4e82-b981-5c23af27b113\") " Apr 21 01:52:47.310988 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:47.310900 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6b734ae0-7dff-4e82-b981-5c23af27b113-console-config\") pod \"6b734ae0-7dff-4e82-b981-5c23af27b113\" (UID: \"6b734ae0-7dff-4e82-b981-5c23af27b113\") " Apr 21 01:52:47.311226 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:47.310984 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b734ae0-7dff-4e82-b981-5c23af27b113-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "6b734ae0-7dff-4e82-b981-5c23af27b113" (UID: "6b734ae0-7dff-4e82-b981-5c23af27b113"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 01:52:47.311226 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:47.311163 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b734ae0-7dff-4e82-b981-5c23af27b113-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6b734ae0-7dff-4e82-b981-5c23af27b113" (UID: "6b734ae0-7dff-4e82-b981-5c23af27b113"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 01:52:47.311423 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:47.311388 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b734ae0-7dff-4e82-b981-5c23af27b113-console-config" (OuterVolumeSpecName: "console-config") pod "6b734ae0-7dff-4e82-b981-5c23af27b113" (UID: "6b734ae0-7dff-4e82-b981-5c23af27b113"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 01:52:47.311423 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:47.311415 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b734ae0-7dff-4e82-b981-5c23af27b113-service-ca" (OuterVolumeSpecName: "service-ca") pod "6b734ae0-7dff-4e82-b981-5c23af27b113" (UID: "6b734ae0-7dff-4e82-b981-5c23af27b113"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 01:52:47.313180 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:47.313149 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b734ae0-7dff-4e82-b981-5c23af27b113-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "6b734ae0-7dff-4e82-b981-5c23af27b113" (UID: "6b734ae0-7dff-4e82-b981-5c23af27b113"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 01:52:47.313287 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:47.313209 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b734ae0-7dff-4e82-b981-5c23af27b113-kube-api-access-dtsnq" (OuterVolumeSpecName: "kube-api-access-dtsnq") pod "6b734ae0-7dff-4e82-b981-5c23af27b113" (UID: "6b734ae0-7dff-4e82-b981-5c23af27b113"). InnerVolumeSpecName "kube-api-access-dtsnq". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 01:52:47.313287 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:47.313249 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b734ae0-7dff-4e82-b981-5c23af27b113-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "6b734ae0-7dff-4e82-b981-5c23af27b113" (UID: "6b734ae0-7dff-4e82-b981-5c23af27b113"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 01:52:47.411903 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:47.411861 2573 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6b734ae0-7dff-4e82-b981-5c23af27b113-service-ca\") on node \"ip-10-0-129-52.ec2.internal\" DevicePath \"\"" Apr 21 01:52:47.411903 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:47.411895 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dtsnq\" (UniqueName: \"kubernetes.io/projected/6b734ae0-7dff-4e82-b981-5c23af27b113-kube-api-access-dtsnq\") on node \"ip-10-0-129-52.ec2.internal\" DevicePath \"\"" Apr 21 01:52:47.411903 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:47.411905 2573 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6b734ae0-7dff-4e82-b981-5c23af27b113-console-config\") on node \"ip-10-0-129-52.ec2.internal\" DevicePath \"\"" Apr 21 01:52:47.411903 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:47.411915 2573 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6b734ae0-7dff-4e82-b981-5c23af27b113-oauth-serving-cert\") on node \"ip-10-0-129-52.ec2.internal\" DevicePath \"\"" Apr 21 01:52:47.412162 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:47.411924 2573 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/6b734ae0-7dff-4e82-b981-5c23af27b113-trusted-ca-bundle\") on node \"ip-10-0-129-52.ec2.internal\" DevicePath \"\"" Apr 21 01:52:47.412162 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:47.411933 2573 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6b734ae0-7dff-4e82-b981-5c23af27b113-console-serving-cert\") on node \"ip-10-0-129-52.ec2.internal\" DevicePath \"\"" Apr 21 01:52:47.412162 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:47.411941 2573 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6b734ae0-7dff-4e82-b981-5c23af27b113-console-oauth-config\") on node \"ip-10-0-129-52.ec2.internal\" DevicePath \"\"" Apr 21 01:52:47.684033 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:47.684006 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-868c6cc5cf-wqxfb_6b734ae0-7dff-4e82-b981-5c23af27b113/console/0.log" Apr 21 01:52:47.684214 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:47.684047 2573 generic.go:358] "Generic (PLEG): container finished" podID="6b734ae0-7dff-4e82-b981-5c23af27b113" containerID="3cd371062a3e430258d7d43adbc5b64a7595a00939e68dc68d45290ee332707a" exitCode=2 Apr 21 01:52:47.684214 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:47.684083 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-868c6cc5cf-wqxfb" event={"ID":"6b734ae0-7dff-4e82-b981-5c23af27b113","Type":"ContainerDied","Data":"3cd371062a3e430258d7d43adbc5b64a7595a00939e68dc68d45290ee332707a"} Apr 21 01:52:47.684214 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:47.684109 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-868c6cc5cf-wqxfb" event={"ID":"6b734ae0-7dff-4e82-b981-5c23af27b113","Type":"ContainerDied","Data":"72de14efddb0da4cf735941fc49731eeee2647384e5ed6eb9d3a115ebe6cb998"} Apr 21 01:52:47.684214 ip-10-0-129-52 
kubenswrapper[2573]: I0421 01:52:47.684121 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-868c6cc5cf-wqxfb" Apr 21 01:52:47.684214 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:47.684143 2573 scope.go:117] "RemoveContainer" containerID="3cd371062a3e430258d7d43adbc5b64a7595a00939e68dc68d45290ee332707a" Apr 21 01:52:47.692595 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:47.692576 2573 scope.go:117] "RemoveContainer" containerID="3cd371062a3e430258d7d43adbc5b64a7595a00939e68dc68d45290ee332707a" Apr 21 01:52:47.692870 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:52:47.692848 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3cd371062a3e430258d7d43adbc5b64a7595a00939e68dc68d45290ee332707a\": container with ID starting with 3cd371062a3e430258d7d43adbc5b64a7595a00939e68dc68d45290ee332707a not found: ID does not exist" containerID="3cd371062a3e430258d7d43adbc5b64a7595a00939e68dc68d45290ee332707a" Apr 21 01:52:47.692955 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:47.692884 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3cd371062a3e430258d7d43adbc5b64a7595a00939e68dc68d45290ee332707a"} err="failed to get container status \"3cd371062a3e430258d7d43adbc5b64a7595a00939e68dc68d45290ee332707a\": rpc error: code = NotFound desc = could not find container \"3cd371062a3e430258d7d43adbc5b64a7595a00939e68dc68d45290ee332707a\": container with ID starting with 3cd371062a3e430258d7d43adbc5b64a7595a00939e68dc68d45290ee332707a not found: ID does not exist" Apr 21 01:52:47.703978 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:47.703860 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-868c6cc5cf-wqxfb"] Apr 21 01:52:47.706957 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:47.706933 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" 
pods=["openshift-console/console-868c6cc5cf-wqxfb"] Apr 21 01:52:49.059178 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:49.059097 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-r9t8h" Apr 21 01:52:49.061593 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:49.061565 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-sbqdn\"" Apr 21 01:52:49.063166 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:49.063144 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b734ae0-7dff-4e82-b981-5c23af27b113" path="/var/lib/kubelet/pods/6b734ae0-7dff-4e82-b981-5c23af27b113/volumes" Apr 21 01:52:49.069545 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:49.069522 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-r9t8h" Apr 21 01:52:49.189207 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:49.189103 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-r9t8h"] Apr 21 01:52:49.191693 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:52:49.191664 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podff1d28c8_cbb1_4385_9c36_da62f691590f.slice/crio-d2c01cfe07152287a65d7891833c11a76860722dea632237fde70d1a2d8bdb7d WatchSource:0}: Error finding container d2c01cfe07152287a65d7891833c11a76860722dea632237fde70d1a2d8bdb7d: Status 404 returned error can't find the container with id d2c01cfe07152287a65d7891833c11a76860722dea632237fde70d1a2d8bdb7d Apr 21 01:52:49.692370 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:49.692331 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-r9t8h" 
event={"ID":"ff1d28c8-cbb1-4385-9c36-da62f691590f","Type":"ContainerStarted","Data":"d2c01cfe07152287a65d7891833c11a76860722dea632237fde70d1a2d8bdb7d"} Apr 21 01:52:51.705298 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:51.705243 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-r9t8h" event={"ID":"ff1d28c8-cbb1-4385-9c36-da62f691590f","Type":"ContainerStarted","Data":"8301b8e3728ad01da0fbb5cfb121c5d8d65647fc8825c7489171e42ca94c3c07"} Apr 21 01:52:51.720331 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:51.720275 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-r9t8h" podStartSLOduration=134.757256686 podStartE2EDuration="2m16.720259273s" podCreationTimestamp="2026-04-21 01:50:35 +0000 UTC" firstStartedPulling="2026-04-21 01:52:49.193962471 +0000 UTC m=+166.773264520" lastFinishedPulling="2026-04-21 01:52:51.156965068 +0000 UTC m=+168.736267107" observedRunningTime="2026-04-21 01:52:51.718760588 +0000 UTC m=+169.298062647" watchObservedRunningTime="2026-04-21 01:52:51.720259273 +0000 UTC m=+169.299561331" Apr 21 01:52:55.681731 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:55.681700 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-w2qdq" Apr 21 01:52:57.705350 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:57.705289 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-6dbfbfc9c4-4gx44" podUID="0aa4d1d4-0b57-47cc-92a7-f6bd34b18794" containerName="console" containerID="cri-o://0071707dc2b740cd875c0542b49744294fbae1fb52d03a1f69e465c786762db8" gracePeriod=15 Apr 21 01:52:57.970891 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:57.970868 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6dbfbfc9c4-4gx44_0aa4d1d4-0b57-47cc-92a7-f6bd34b18794/console/0.log" Apr 21 01:52:57.971012 ip-10-0-129-52 
kubenswrapper[2573]: I0421 01:52:57.970928 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6dbfbfc9c4-4gx44" Apr 21 01:52:58.117331 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:58.115554 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8sqgb\" (UniqueName: \"kubernetes.io/projected/0aa4d1d4-0b57-47cc-92a7-f6bd34b18794-kube-api-access-8sqgb\") pod \"0aa4d1d4-0b57-47cc-92a7-f6bd34b18794\" (UID: \"0aa4d1d4-0b57-47cc-92a7-f6bd34b18794\") " Apr 21 01:52:58.117331 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:58.115604 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0aa4d1d4-0b57-47cc-92a7-f6bd34b18794-console-oauth-config\") pod \"0aa4d1d4-0b57-47cc-92a7-f6bd34b18794\" (UID: \"0aa4d1d4-0b57-47cc-92a7-f6bd34b18794\") " Apr 21 01:52:58.117331 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:58.115643 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0aa4d1d4-0b57-47cc-92a7-f6bd34b18794-oauth-serving-cert\") pod \"0aa4d1d4-0b57-47cc-92a7-f6bd34b18794\" (UID: \"0aa4d1d4-0b57-47cc-92a7-f6bd34b18794\") " Apr 21 01:52:58.117331 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:58.115707 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0aa4d1d4-0b57-47cc-92a7-f6bd34b18794-console-serving-cert\") pod \"0aa4d1d4-0b57-47cc-92a7-f6bd34b18794\" (UID: \"0aa4d1d4-0b57-47cc-92a7-f6bd34b18794\") " Apr 21 01:52:58.117331 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:58.115741 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0aa4d1d4-0b57-47cc-92a7-f6bd34b18794-service-ca\") pod 
\"0aa4d1d4-0b57-47cc-92a7-f6bd34b18794\" (UID: \"0aa4d1d4-0b57-47cc-92a7-f6bd34b18794\") " Apr 21 01:52:58.117331 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:58.115776 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0aa4d1d4-0b57-47cc-92a7-f6bd34b18794-console-config\") pod \"0aa4d1d4-0b57-47cc-92a7-f6bd34b18794\" (UID: \"0aa4d1d4-0b57-47cc-92a7-f6bd34b18794\") " Apr 21 01:52:58.117331 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:58.116453 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0aa4d1d4-0b57-47cc-92a7-f6bd34b18794-console-config" (OuterVolumeSpecName: "console-config") pod "0aa4d1d4-0b57-47cc-92a7-f6bd34b18794" (UID: "0aa4d1d4-0b57-47cc-92a7-f6bd34b18794"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 01:52:58.117331 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:58.117064 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0aa4d1d4-0b57-47cc-92a7-f6bd34b18794-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "0aa4d1d4-0b57-47cc-92a7-f6bd34b18794" (UID: "0aa4d1d4-0b57-47cc-92a7-f6bd34b18794"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 01:52:58.119203 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:58.119174 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0aa4d1d4-0b57-47cc-92a7-f6bd34b18794-service-ca" (OuterVolumeSpecName: "service-ca") pod "0aa4d1d4-0b57-47cc-92a7-f6bd34b18794" (UID: "0aa4d1d4-0b57-47cc-92a7-f6bd34b18794"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 01:52:58.126110 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:58.125938 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0aa4d1d4-0b57-47cc-92a7-f6bd34b18794-kube-api-access-8sqgb" (OuterVolumeSpecName: "kube-api-access-8sqgb") pod "0aa4d1d4-0b57-47cc-92a7-f6bd34b18794" (UID: "0aa4d1d4-0b57-47cc-92a7-f6bd34b18794"). InnerVolumeSpecName "kube-api-access-8sqgb". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 01:52:58.126110 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:58.126017 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0aa4d1d4-0b57-47cc-92a7-f6bd34b18794-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "0aa4d1d4-0b57-47cc-92a7-f6bd34b18794" (UID: "0aa4d1d4-0b57-47cc-92a7-f6bd34b18794"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 01:52:58.126110 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:58.126071 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0aa4d1d4-0b57-47cc-92a7-f6bd34b18794-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "0aa4d1d4-0b57-47cc-92a7-f6bd34b18794" (UID: "0aa4d1d4-0b57-47cc-92a7-f6bd34b18794"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 01:52:58.217016 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:58.216928 2573 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0aa4d1d4-0b57-47cc-92a7-f6bd34b18794-console-serving-cert\") on node \"ip-10-0-129-52.ec2.internal\" DevicePath \"\"" Apr 21 01:52:58.217016 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:58.216954 2573 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0aa4d1d4-0b57-47cc-92a7-f6bd34b18794-service-ca\") on node \"ip-10-0-129-52.ec2.internal\" DevicePath \"\"" Apr 21 01:52:58.217016 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:58.216964 2573 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0aa4d1d4-0b57-47cc-92a7-f6bd34b18794-console-config\") on node \"ip-10-0-129-52.ec2.internal\" DevicePath \"\"" Apr 21 01:52:58.217016 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:58.216973 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8sqgb\" (UniqueName: \"kubernetes.io/projected/0aa4d1d4-0b57-47cc-92a7-f6bd34b18794-kube-api-access-8sqgb\") on node \"ip-10-0-129-52.ec2.internal\" DevicePath \"\"" Apr 21 01:52:58.217016 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:58.216982 2573 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0aa4d1d4-0b57-47cc-92a7-f6bd34b18794-console-oauth-config\") on node \"ip-10-0-129-52.ec2.internal\" DevicePath \"\"" Apr 21 01:52:58.217016 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:58.216991 2573 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0aa4d1d4-0b57-47cc-92a7-f6bd34b18794-oauth-serving-cert\") on node \"ip-10-0-129-52.ec2.internal\" DevicePath \"\"" Apr 21 01:52:58.728838 ip-10-0-129-52 
kubenswrapper[2573]: I0421 01:52:58.728790 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6dbfbfc9c4-4gx44_0aa4d1d4-0b57-47cc-92a7-f6bd34b18794/console/0.log" Apr 21 01:52:58.729315 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:58.728853 2573 generic.go:358] "Generic (PLEG): container finished" podID="0aa4d1d4-0b57-47cc-92a7-f6bd34b18794" containerID="0071707dc2b740cd875c0542b49744294fbae1fb52d03a1f69e465c786762db8" exitCode=2 Apr 21 01:52:58.729315 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:58.728922 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6dbfbfc9c4-4gx44" event={"ID":"0aa4d1d4-0b57-47cc-92a7-f6bd34b18794","Type":"ContainerDied","Data":"0071707dc2b740cd875c0542b49744294fbae1fb52d03a1f69e465c786762db8"} Apr 21 01:52:58.729315 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:58.728928 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6dbfbfc9c4-4gx44" Apr 21 01:52:58.729315 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:58.728950 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6dbfbfc9c4-4gx44" event={"ID":"0aa4d1d4-0b57-47cc-92a7-f6bd34b18794","Type":"ContainerDied","Data":"76e5ae957e7cd01ab36ac8dd435c8b3b9cb2e5a7f4e42b43f5be0b626dc392c9"} Apr 21 01:52:58.729315 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:58.728965 2573 scope.go:117] "RemoveContainer" containerID="0071707dc2b740cd875c0542b49744294fbae1fb52d03a1f69e465c786762db8" Apr 21 01:52:58.738440 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:58.738421 2573 scope.go:117] "RemoveContainer" containerID="0071707dc2b740cd875c0542b49744294fbae1fb52d03a1f69e465c786762db8" Apr 21 01:52:58.738711 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:52:58.738689 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"0071707dc2b740cd875c0542b49744294fbae1fb52d03a1f69e465c786762db8\": container with ID starting with 0071707dc2b740cd875c0542b49744294fbae1fb52d03a1f69e465c786762db8 not found: ID does not exist" containerID="0071707dc2b740cd875c0542b49744294fbae1fb52d03a1f69e465c786762db8" Apr 21 01:52:58.738795 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:58.738724 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0071707dc2b740cd875c0542b49744294fbae1fb52d03a1f69e465c786762db8"} err="failed to get container status \"0071707dc2b740cd875c0542b49744294fbae1fb52d03a1f69e465c786762db8\": rpc error: code = NotFound desc = could not find container \"0071707dc2b740cd875c0542b49744294fbae1fb52d03a1f69e465c786762db8\": container with ID starting with 0071707dc2b740cd875c0542b49744294fbae1fb52d03a1f69e465c786762db8 not found: ID does not exist" Apr 21 01:52:58.751669 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:58.751540 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6dbfbfc9c4-4gx44"] Apr 21 01:52:58.754518 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:58.754447 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-6dbfbfc9c4-4gx44"] Apr 21 01:52:59.063419 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:52:59.063338 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0aa4d1d4-0b57-47cc-92a7-f6bd34b18794" path="/var/lib/kubelet/pods/0aa4d1d4-0b57-47cc-92a7-f6bd34b18794/volumes" Apr 21 01:53:01.448756 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:53:01.448728 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-f9b78694b-4qqtk" Apr 21 01:53:01.452975 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:53:01.452952 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-f9b78694b-4qqtk" Apr 21 01:53:40.993250 ip-10-0-129-52 
kubenswrapper[2573]: I0421 01:53:40.993203 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-7b7444b98f-2vtcr"] Apr 21 01:53:40.993908 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:53:40.993887 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6b734ae0-7dff-4e82-b981-5c23af27b113" containerName="console" Apr 21 01:53:40.993908 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:53:40.993909 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b734ae0-7dff-4e82-b981-5c23af27b113" containerName="console" Apr 21 01:53:40.994067 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:53:40.993959 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0aa4d1d4-0b57-47cc-92a7-f6bd34b18794" containerName="console" Apr 21 01:53:40.994067 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:53:40.993969 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="0aa4d1d4-0b57-47cc-92a7-f6bd34b18794" containerName="console" Apr 21 01:53:40.994067 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:53:40.994058 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="6b734ae0-7dff-4e82-b981-5c23af27b113" containerName="console" Apr 21 01:53:40.994210 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:53:40.994072 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="0aa4d1d4-0b57-47cc-92a7-f6bd34b18794" containerName="console" Apr 21 01:53:40.998910 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:53:40.998871 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-7b7444b98f-2vtcr" Apr 21 01:53:41.001850 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:53:41.001806 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-h79pn\"" Apr 21 01:53:41.002009 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:53:41.001806 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\"" Apr 21 01:53:41.002009 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:53:41.001967 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\"" Apr 21 01:53:41.002167 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:53:41.002148 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\"" Apr 21 01:53:41.002221 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:53:41.002204 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\"" Apr 21 01:53:41.002777 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:53:41.002760 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\"" Apr 21 01:53:41.007684 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:53:41.007652 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\"" Apr 21 01:53:41.008632 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:53:41.008606 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-7b7444b98f-2vtcr"] Apr 21 01:53:41.030128 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:53:41.030083 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/671f9929-3233-4c87-98c8-8cbd3f38929c-metrics-client-ca\") pod \"telemeter-client-7b7444b98f-2vtcr\" (UID: \"671f9929-3233-4c87-98c8-8cbd3f38929c\") " pod="openshift-monitoring/telemeter-client-7b7444b98f-2vtcr" Apr 21 01:53:41.030472 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:53:41.030449 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/671f9929-3233-4c87-98c8-8cbd3f38929c-serving-certs-ca-bundle\") pod \"telemeter-client-7b7444b98f-2vtcr\" (UID: \"671f9929-3233-4c87-98c8-8cbd3f38929c\") " pod="openshift-monitoring/telemeter-client-7b7444b98f-2vtcr" Apr 21 01:53:41.030692 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:53:41.030675 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/671f9929-3233-4c87-98c8-8cbd3f38929c-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-7b7444b98f-2vtcr\" (UID: \"671f9929-3233-4c87-98c8-8cbd3f38929c\") " pod="openshift-monitoring/telemeter-client-7b7444b98f-2vtcr" Apr 21 01:53:41.030867 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:53:41.030843 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8kcl\" (UniqueName: \"kubernetes.io/projected/671f9929-3233-4c87-98c8-8cbd3f38929c-kube-api-access-l8kcl\") pod \"telemeter-client-7b7444b98f-2vtcr\" (UID: \"671f9929-3233-4c87-98c8-8cbd3f38929c\") " pod="openshift-monitoring/telemeter-client-7b7444b98f-2vtcr" Apr 21 01:53:41.030988 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:53:41.030881 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: 
\"kubernetes.io/secret/671f9929-3233-4c87-98c8-8cbd3f38929c-secret-telemeter-client\") pod \"telemeter-client-7b7444b98f-2vtcr\" (UID: \"671f9929-3233-4c87-98c8-8cbd3f38929c\") " pod="openshift-monitoring/telemeter-client-7b7444b98f-2vtcr" Apr 21 01:53:41.030988 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:53:41.030929 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/671f9929-3233-4c87-98c8-8cbd3f38929c-telemeter-trusted-ca-bundle\") pod \"telemeter-client-7b7444b98f-2vtcr\" (UID: \"671f9929-3233-4c87-98c8-8cbd3f38929c\") " pod="openshift-monitoring/telemeter-client-7b7444b98f-2vtcr" Apr 21 01:53:41.031073 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:53:41.030990 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/671f9929-3233-4c87-98c8-8cbd3f38929c-federate-client-tls\") pod \"telemeter-client-7b7444b98f-2vtcr\" (UID: \"671f9929-3233-4c87-98c8-8cbd3f38929c\") " pod="openshift-monitoring/telemeter-client-7b7444b98f-2vtcr" Apr 21 01:53:41.031073 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:53:41.031017 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/671f9929-3233-4c87-98c8-8cbd3f38929c-telemeter-client-tls\") pod \"telemeter-client-7b7444b98f-2vtcr\" (UID: \"671f9929-3233-4c87-98c8-8cbd3f38929c\") " pod="openshift-monitoring/telemeter-client-7b7444b98f-2vtcr" Apr 21 01:53:41.131929 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:53:41.131894 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/671f9929-3233-4c87-98c8-8cbd3f38929c-serving-certs-ca-bundle\") pod \"telemeter-client-7b7444b98f-2vtcr\" (UID: 
\"671f9929-3233-4c87-98c8-8cbd3f38929c\") " pod="openshift-monitoring/telemeter-client-7b7444b98f-2vtcr" Apr 21 01:53:41.132163 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:53:41.132130 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/671f9929-3233-4c87-98c8-8cbd3f38929c-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-7b7444b98f-2vtcr\" (UID: \"671f9929-3233-4c87-98c8-8cbd3f38929c\") " pod="openshift-monitoring/telemeter-client-7b7444b98f-2vtcr" Apr 21 01:53:41.132385 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:53:41.132361 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l8kcl\" (UniqueName: \"kubernetes.io/projected/671f9929-3233-4c87-98c8-8cbd3f38929c-kube-api-access-l8kcl\") pod \"telemeter-client-7b7444b98f-2vtcr\" (UID: \"671f9929-3233-4c87-98c8-8cbd3f38929c\") " pod="openshift-monitoring/telemeter-client-7b7444b98f-2vtcr" Apr 21 01:53:41.132481 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:53:41.132403 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/671f9929-3233-4c87-98c8-8cbd3f38929c-secret-telemeter-client\") pod \"telemeter-client-7b7444b98f-2vtcr\" (UID: \"671f9929-3233-4c87-98c8-8cbd3f38929c\") " pod="openshift-monitoring/telemeter-client-7b7444b98f-2vtcr" Apr 21 01:53:41.132481 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:53:41.132449 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/671f9929-3233-4c87-98c8-8cbd3f38929c-telemeter-trusted-ca-bundle\") pod \"telemeter-client-7b7444b98f-2vtcr\" (UID: \"671f9929-3233-4c87-98c8-8cbd3f38929c\") " pod="openshift-monitoring/telemeter-client-7b7444b98f-2vtcr" Apr 21 01:53:41.132684 ip-10-0-129-52 kubenswrapper[2573]: I0421 
01:53:41.132661 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/671f9929-3233-4c87-98c8-8cbd3f38929c-federate-client-tls\") pod \"telemeter-client-7b7444b98f-2vtcr\" (UID: \"671f9929-3233-4c87-98c8-8cbd3f38929c\") " pod="openshift-monitoring/telemeter-client-7b7444b98f-2vtcr" Apr 21 01:53:41.132753 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:53:41.132708 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/671f9929-3233-4c87-98c8-8cbd3f38929c-telemeter-client-tls\") pod \"telemeter-client-7b7444b98f-2vtcr\" (UID: \"671f9929-3233-4c87-98c8-8cbd3f38929c\") " pod="openshift-monitoring/telemeter-client-7b7444b98f-2vtcr" Apr 21 01:53:41.132805 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:53:41.132766 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/671f9929-3233-4c87-98c8-8cbd3f38929c-metrics-client-ca\") pod \"telemeter-client-7b7444b98f-2vtcr\" (UID: \"671f9929-3233-4c87-98c8-8cbd3f38929c\") " pod="openshift-monitoring/telemeter-client-7b7444b98f-2vtcr" Apr 21 01:53:41.132894 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:53:41.132858 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/671f9929-3233-4c87-98c8-8cbd3f38929c-serving-certs-ca-bundle\") pod \"telemeter-client-7b7444b98f-2vtcr\" (UID: \"671f9929-3233-4c87-98c8-8cbd3f38929c\") " pod="openshift-monitoring/telemeter-client-7b7444b98f-2vtcr" Apr 21 01:53:41.133285 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:53:41.133259 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/671f9929-3233-4c87-98c8-8cbd3f38929c-telemeter-trusted-ca-bundle\") pod 
\"telemeter-client-7b7444b98f-2vtcr\" (UID: \"671f9929-3233-4c87-98c8-8cbd3f38929c\") " pod="openshift-monitoring/telemeter-client-7b7444b98f-2vtcr" Apr 21 01:53:41.133484 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:53:41.133462 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/671f9929-3233-4c87-98c8-8cbd3f38929c-metrics-client-ca\") pod \"telemeter-client-7b7444b98f-2vtcr\" (UID: \"671f9929-3233-4c87-98c8-8cbd3f38929c\") " pod="openshift-monitoring/telemeter-client-7b7444b98f-2vtcr" Apr 21 01:53:41.134710 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:53:41.134689 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/671f9929-3233-4c87-98c8-8cbd3f38929c-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-7b7444b98f-2vtcr\" (UID: \"671f9929-3233-4c87-98c8-8cbd3f38929c\") " pod="openshift-monitoring/telemeter-client-7b7444b98f-2vtcr" Apr 21 01:53:41.135008 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:53:41.134993 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/671f9929-3233-4c87-98c8-8cbd3f38929c-secret-telemeter-client\") pod \"telemeter-client-7b7444b98f-2vtcr\" (UID: \"671f9929-3233-4c87-98c8-8cbd3f38929c\") " pod="openshift-monitoring/telemeter-client-7b7444b98f-2vtcr" Apr 21 01:53:41.135129 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:53:41.135107 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/671f9929-3233-4c87-98c8-8cbd3f38929c-federate-client-tls\") pod \"telemeter-client-7b7444b98f-2vtcr\" (UID: \"671f9929-3233-4c87-98c8-8cbd3f38929c\") " pod="openshift-monitoring/telemeter-client-7b7444b98f-2vtcr" Apr 21 01:53:41.135252 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:53:41.135238 
2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/671f9929-3233-4c87-98c8-8cbd3f38929c-telemeter-client-tls\") pod \"telemeter-client-7b7444b98f-2vtcr\" (UID: \"671f9929-3233-4c87-98c8-8cbd3f38929c\") " pod="openshift-monitoring/telemeter-client-7b7444b98f-2vtcr" Apr 21 01:53:41.140064 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:53:41.140041 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8kcl\" (UniqueName: \"kubernetes.io/projected/671f9929-3233-4c87-98c8-8cbd3f38929c-kube-api-access-l8kcl\") pod \"telemeter-client-7b7444b98f-2vtcr\" (UID: \"671f9929-3233-4c87-98c8-8cbd3f38929c\") " pod="openshift-monitoring/telemeter-client-7b7444b98f-2vtcr" Apr 21 01:53:41.312248 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:53:41.312164 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-7b7444b98f-2vtcr" Apr 21 01:53:41.438364 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:53:41.438336 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-7b7444b98f-2vtcr"] Apr 21 01:53:41.440265 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:53:41.440236 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod671f9929_3233_4c87_98c8_8cbd3f38929c.slice/crio-8dd968cf530e5ee5ae955adb39def700bb8c5c1e9d803faab4753f6600a33af0 WatchSource:0}: Error finding container 8dd968cf530e5ee5ae955adb39def700bb8c5c1e9d803faab4753f6600a33af0: Status 404 returned error can't find the container with id 8dd968cf530e5ee5ae955adb39def700bb8c5c1e9d803faab4753f6600a33af0 Apr 21 01:53:41.859056 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:53:41.859021 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-7b7444b98f-2vtcr" 
event={"ID":"671f9929-3233-4c87-98c8-8cbd3f38929c","Type":"ContainerStarted","Data":"8dd968cf530e5ee5ae955adb39def700bb8c5c1e9d803faab4753f6600a33af0"} Apr 21 01:53:43.867384 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:53:43.867346 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-7b7444b98f-2vtcr" event={"ID":"671f9929-3233-4c87-98c8-8cbd3f38929c","Type":"ContainerStarted","Data":"1766ce40ae6674c5336f5d294b52360feeeeaf9cabe50755d71ab40b6b3c71cb"} Apr 21 01:53:43.867384 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:53:43.867383 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-7b7444b98f-2vtcr" event={"ID":"671f9929-3233-4c87-98c8-8cbd3f38929c","Type":"ContainerStarted","Data":"e1e4c369729c06b5a51fd04edb9d1441bb60c1d5a574f83296f8c34d58bde019"} Apr 21 01:53:43.867384 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:53:43.867392 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-7b7444b98f-2vtcr" event={"ID":"671f9929-3233-4c87-98c8-8cbd3f38929c","Type":"ContainerStarted","Data":"4d0f06f5e01981ec824b18f25615c34ab9e01219e7989e668c890e8d1cd17b6f"} Apr 21 01:53:43.889498 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:53:43.889449 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-7b7444b98f-2vtcr" podStartSLOduration=2.2511326609999998 podStartE2EDuration="3.889433043s" podCreationTimestamp="2026-04-21 01:53:40 +0000 UTC" firstStartedPulling="2026-04-21 01:53:41.442084808 +0000 UTC m=+219.021386844" lastFinishedPulling="2026-04-21 01:53:43.080385186 +0000 UTC m=+220.659687226" observedRunningTime="2026-04-21 01:53:43.886494726 +0000 UTC m=+221.465796784" watchObservedRunningTime="2026-04-21 01:53:43.889433043 +0000 UTC m=+221.468735079" Apr 21 01:53:44.562984 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:53:44.562946 2573 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-console/console-57f87fb9f9-fcxnl"] Apr 21 01:53:44.566673 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:53:44.566635 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-57f87fb9f9-fcxnl" Apr 21 01:53:44.577426 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:53:44.577400 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-57f87fb9f9-fcxnl"] Apr 21 01:53:44.662125 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:53:44.662088 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f3818c9e-4bec-4a37-81d6-893c9e3a6a32-service-ca\") pod \"console-57f87fb9f9-fcxnl\" (UID: \"f3818c9e-4bec-4a37-81d6-893c9e3a6a32\") " pod="openshift-console/console-57f87fb9f9-fcxnl" Apr 21 01:53:44.662125 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:53:44.662127 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f3818c9e-4bec-4a37-81d6-893c9e3a6a32-oauth-serving-cert\") pod \"console-57f87fb9f9-fcxnl\" (UID: \"f3818c9e-4bec-4a37-81d6-893c9e3a6a32\") " pod="openshift-console/console-57f87fb9f9-fcxnl" Apr 21 01:53:44.662331 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:53:44.662193 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f3818c9e-4bec-4a37-81d6-893c9e3a6a32-console-serving-cert\") pod \"console-57f87fb9f9-fcxnl\" (UID: \"f3818c9e-4bec-4a37-81d6-893c9e3a6a32\") " pod="openshift-console/console-57f87fb9f9-fcxnl" Apr 21 01:53:44.662331 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:53:44.662258 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/f3818c9e-4bec-4a37-81d6-893c9e3a6a32-trusted-ca-bundle\") pod \"console-57f87fb9f9-fcxnl\" (UID: \"f3818c9e-4bec-4a37-81d6-893c9e3a6a32\") " pod="openshift-console/console-57f87fb9f9-fcxnl" Apr 21 01:53:44.662331 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:53:44.662302 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f3818c9e-4bec-4a37-81d6-893c9e3a6a32-console-oauth-config\") pod \"console-57f87fb9f9-fcxnl\" (UID: \"f3818c9e-4bec-4a37-81d6-893c9e3a6a32\") " pod="openshift-console/console-57f87fb9f9-fcxnl" Apr 21 01:53:44.662432 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:53:44.662333 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f3818c9e-4bec-4a37-81d6-893c9e3a6a32-console-config\") pod \"console-57f87fb9f9-fcxnl\" (UID: \"f3818c9e-4bec-4a37-81d6-893c9e3a6a32\") " pod="openshift-console/console-57f87fb9f9-fcxnl" Apr 21 01:53:44.662432 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:53:44.662350 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gvg8\" (UniqueName: \"kubernetes.io/projected/f3818c9e-4bec-4a37-81d6-893c9e3a6a32-kube-api-access-7gvg8\") pod \"console-57f87fb9f9-fcxnl\" (UID: \"f3818c9e-4bec-4a37-81d6-893c9e3a6a32\") " pod="openshift-console/console-57f87fb9f9-fcxnl" Apr 21 01:53:44.763532 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:53:44.763485 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f3818c9e-4bec-4a37-81d6-893c9e3a6a32-service-ca\") pod \"console-57f87fb9f9-fcxnl\" (UID: \"f3818c9e-4bec-4a37-81d6-893c9e3a6a32\") " pod="openshift-console/console-57f87fb9f9-fcxnl" Apr 21 01:53:44.763532 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:53:44.763533 
2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f3818c9e-4bec-4a37-81d6-893c9e3a6a32-oauth-serving-cert\") pod \"console-57f87fb9f9-fcxnl\" (UID: \"f3818c9e-4bec-4a37-81d6-893c9e3a6a32\") " pod="openshift-console/console-57f87fb9f9-fcxnl" Apr 21 01:53:44.763764 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:53:44.763653 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f3818c9e-4bec-4a37-81d6-893c9e3a6a32-console-serving-cert\") pod \"console-57f87fb9f9-fcxnl\" (UID: \"f3818c9e-4bec-4a37-81d6-893c9e3a6a32\") " pod="openshift-console/console-57f87fb9f9-fcxnl" Apr 21 01:53:44.763764 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:53:44.763735 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f3818c9e-4bec-4a37-81d6-893c9e3a6a32-trusted-ca-bundle\") pod \"console-57f87fb9f9-fcxnl\" (UID: \"f3818c9e-4bec-4a37-81d6-893c9e3a6a32\") " pod="openshift-console/console-57f87fb9f9-fcxnl" Apr 21 01:53:44.763918 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:53:44.763773 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f3818c9e-4bec-4a37-81d6-893c9e3a6a32-console-oauth-config\") pod \"console-57f87fb9f9-fcxnl\" (UID: \"f3818c9e-4bec-4a37-81d6-893c9e3a6a32\") " pod="openshift-console/console-57f87fb9f9-fcxnl" Apr 21 01:53:44.763918 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:53:44.763799 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f3818c9e-4bec-4a37-81d6-893c9e3a6a32-console-config\") pod \"console-57f87fb9f9-fcxnl\" (UID: \"f3818c9e-4bec-4a37-81d6-893c9e3a6a32\") " pod="openshift-console/console-57f87fb9f9-fcxnl" Apr 21 
01:53:44.763918 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:53:44.763847 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7gvg8\" (UniqueName: \"kubernetes.io/projected/f3818c9e-4bec-4a37-81d6-893c9e3a6a32-kube-api-access-7gvg8\") pod \"console-57f87fb9f9-fcxnl\" (UID: \"f3818c9e-4bec-4a37-81d6-893c9e3a6a32\") " pod="openshift-console/console-57f87fb9f9-fcxnl" Apr 21 01:53:44.764325 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:53:44.764294 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f3818c9e-4bec-4a37-81d6-893c9e3a6a32-oauth-serving-cert\") pod \"console-57f87fb9f9-fcxnl\" (UID: \"f3818c9e-4bec-4a37-81d6-893c9e3a6a32\") " pod="openshift-console/console-57f87fb9f9-fcxnl" Apr 21 01:53:44.764431 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:53:44.764352 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f3818c9e-4bec-4a37-81d6-893c9e3a6a32-service-ca\") pod \"console-57f87fb9f9-fcxnl\" (UID: \"f3818c9e-4bec-4a37-81d6-893c9e3a6a32\") " pod="openshift-console/console-57f87fb9f9-fcxnl" Apr 21 01:53:44.764570 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:53:44.764553 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f3818c9e-4bec-4a37-81d6-893c9e3a6a32-console-config\") pod \"console-57f87fb9f9-fcxnl\" (UID: \"f3818c9e-4bec-4a37-81d6-893c9e3a6a32\") " pod="openshift-console/console-57f87fb9f9-fcxnl" Apr 21 01:53:44.764678 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:53:44.764657 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f3818c9e-4bec-4a37-81d6-893c9e3a6a32-trusted-ca-bundle\") pod \"console-57f87fb9f9-fcxnl\" (UID: \"f3818c9e-4bec-4a37-81d6-893c9e3a6a32\") " 
pod="openshift-console/console-57f87fb9f9-fcxnl" Apr 21 01:53:44.766163 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:53:44.766136 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f3818c9e-4bec-4a37-81d6-893c9e3a6a32-console-serving-cert\") pod \"console-57f87fb9f9-fcxnl\" (UID: \"f3818c9e-4bec-4a37-81d6-893c9e3a6a32\") " pod="openshift-console/console-57f87fb9f9-fcxnl" Apr 21 01:53:44.766250 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:53:44.766179 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f3818c9e-4bec-4a37-81d6-893c9e3a6a32-console-oauth-config\") pod \"console-57f87fb9f9-fcxnl\" (UID: \"f3818c9e-4bec-4a37-81d6-893c9e3a6a32\") " pod="openshift-console/console-57f87fb9f9-fcxnl" Apr 21 01:53:44.771162 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:53:44.771145 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gvg8\" (UniqueName: \"kubernetes.io/projected/f3818c9e-4bec-4a37-81d6-893c9e3a6a32-kube-api-access-7gvg8\") pod \"console-57f87fb9f9-fcxnl\" (UID: \"f3818c9e-4bec-4a37-81d6-893c9e3a6a32\") " pod="openshift-console/console-57f87fb9f9-fcxnl" Apr 21 01:53:44.876726 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:53:44.876701 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-57f87fb9f9-fcxnl" Apr 21 01:53:45.000043 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:53:44.999957 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-57f87fb9f9-fcxnl"] Apr 21 01:53:45.002260 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:53:45.002232 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf3818c9e_4bec_4a37_81d6_893c9e3a6a32.slice/crio-2ce3ca076fca5d94cec0a8c57e8a4a9766537d522bfc644fd7e74114681d8d38 WatchSource:0}: Error finding container 2ce3ca076fca5d94cec0a8c57e8a4a9766537d522bfc644fd7e74114681d8d38: Status 404 returned error can't find the container with id 2ce3ca076fca5d94cec0a8c57e8a4a9766537d522bfc644fd7e74114681d8d38 Apr 21 01:53:45.874458 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:53:45.874420 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-57f87fb9f9-fcxnl" event={"ID":"f3818c9e-4bec-4a37-81d6-893c9e3a6a32","Type":"ContainerStarted","Data":"963f3e84c30e67814d0b77a68197f84f96a3c0a536d5c8b96639ea2a306b4446"} Apr 21 01:53:45.874458 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:53:45.874456 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-57f87fb9f9-fcxnl" event={"ID":"f3818c9e-4bec-4a37-81d6-893c9e3a6a32","Type":"ContainerStarted","Data":"2ce3ca076fca5d94cec0a8c57e8a4a9766537d522bfc644fd7e74114681d8d38"} Apr 21 01:53:45.891249 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:53:45.891200 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-57f87fb9f9-fcxnl" podStartSLOduration=1.891185123 podStartE2EDuration="1.891185123s" podCreationTimestamp="2026-04-21 01:53:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 01:53:45.889238701 +0000 UTC m=+223.468540759" 
watchObservedRunningTime="2026-04-21 01:53:45.891185123 +0000 UTC m=+223.470487182" Apr 21 01:53:54.877864 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:53:54.877758 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-57f87fb9f9-fcxnl" Apr 21 01:53:54.877864 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:53:54.877872 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-57f87fb9f9-fcxnl" Apr 21 01:53:54.882610 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:53:54.882584 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-57f87fb9f9-fcxnl" Apr 21 01:53:54.905377 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:53:54.905346 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-57f87fb9f9-fcxnl" Apr 21 01:53:54.949892 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:53:54.949864 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6c4fcddbdb-bbk5k"] Apr 21 01:53:55.053855 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:53:55.053806 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-7d8db5cdb6-rcp5r"] Apr 21 01:53:55.057327 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:53:55.057304 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7d8db5cdb6-rcp5r" Apr 21 01:53:55.066144 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:53:55.066119 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7d8db5cdb6-rcp5r"] Apr 21 01:53:55.153069 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:53:55.152963 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grm4k\" (UniqueName: \"kubernetes.io/projected/fd81e18c-aea5-40b0-aff0-eec1085bb796-kube-api-access-grm4k\") pod \"console-7d8db5cdb6-rcp5r\" (UID: \"fd81e18c-aea5-40b0-aff0-eec1085bb796\") " pod="openshift-console/console-7d8db5cdb6-rcp5r" Apr 21 01:53:55.153069 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:53:55.153010 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fd81e18c-aea5-40b0-aff0-eec1085bb796-service-ca\") pod \"console-7d8db5cdb6-rcp5r\" (UID: \"fd81e18c-aea5-40b0-aff0-eec1085bb796\") " pod="openshift-console/console-7d8db5cdb6-rcp5r" Apr 21 01:53:55.153069 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:53:55.153035 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/fd81e18c-aea5-40b0-aff0-eec1085bb796-oauth-serving-cert\") pod \"console-7d8db5cdb6-rcp5r\" (UID: \"fd81e18c-aea5-40b0-aff0-eec1085bb796\") " pod="openshift-console/console-7d8db5cdb6-rcp5r" Apr 21 01:53:55.153069 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:53:55.153070 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/fd81e18c-aea5-40b0-aff0-eec1085bb796-console-serving-cert\") pod \"console-7d8db5cdb6-rcp5r\" (UID: \"fd81e18c-aea5-40b0-aff0-eec1085bb796\") " pod="openshift-console/console-7d8db5cdb6-rcp5r" Apr 21 
01:53:55.153356 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:53:55.153096 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/fd81e18c-aea5-40b0-aff0-eec1085bb796-console-config\") pod \"console-7d8db5cdb6-rcp5r\" (UID: \"fd81e18c-aea5-40b0-aff0-eec1085bb796\") " pod="openshift-console/console-7d8db5cdb6-rcp5r" Apr 21 01:53:55.153356 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:53:55.153157 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/fd81e18c-aea5-40b0-aff0-eec1085bb796-console-oauth-config\") pod \"console-7d8db5cdb6-rcp5r\" (UID: \"fd81e18c-aea5-40b0-aff0-eec1085bb796\") " pod="openshift-console/console-7d8db5cdb6-rcp5r" Apr 21 01:53:55.153356 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:53:55.153211 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fd81e18c-aea5-40b0-aff0-eec1085bb796-trusted-ca-bundle\") pod \"console-7d8db5cdb6-rcp5r\" (UID: \"fd81e18c-aea5-40b0-aff0-eec1085bb796\") " pod="openshift-console/console-7d8db5cdb6-rcp5r" Apr 21 01:53:55.254655 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:53:55.254620 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-grm4k\" (UniqueName: \"kubernetes.io/projected/fd81e18c-aea5-40b0-aff0-eec1085bb796-kube-api-access-grm4k\") pod \"console-7d8db5cdb6-rcp5r\" (UID: \"fd81e18c-aea5-40b0-aff0-eec1085bb796\") " pod="openshift-console/console-7d8db5cdb6-rcp5r" Apr 21 01:53:55.254655 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:53:55.254661 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fd81e18c-aea5-40b0-aff0-eec1085bb796-service-ca\") pod 
\"console-7d8db5cdb6-rcp5r\" (UID: \"fd81e18c-aea5-40b0-aff0-eec1085bb796\") " pod="openshift-console/console-7d8db5cdb6-rcp5r" Apr 21 01:53:55.254941 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:53:55.254688 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/fd81e18c-aea5-40b0-aff0-eec1085bb796-oauth-serving-cert\") pod \"console-7d8db5cdb6-rcp5r\" (UID: \"fd81e18c-aea5-40b0-aff0-eec1085bb796\") " pod="openshift-console/console-7d8db5cdb6-rcp5r" Apr 21 01:53:55.254941 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:53:55.254720 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/fd81e18c-aea5-40b0-aff0-eec1085bb796-console-serving-cert\") pod \"console-7d8db5cdb6-rcp5r\" (UID: \"fd81e18c-aea5-40b0-aff0-eec1085bb796\") " pod="openshift-console/console-7d8db5cdb6-rcp5r" Apr 21 01:53:55.254941 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:53:55.254757 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/fd81e18c-aea5-40b0-aff0-eec1085bb796-console-config\") pod \"console-7d8db5cdb6-rcp5r\" (UID: \"fd81e18c-aea5-40b0-aff0-eec1085bb796\") " pod="openshift-console/console-7d8db5cdb6-rcp5r" Apr 21 01:53:55.254941 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:53:55.254790 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/fd81e18c-aea5-40b0-aff0-eec1085bb796-console-oauth-config\") pod \"console-7d8db5cdb6-rcp5r\" (UID: \"fd81e18c-aea5-40b0-aff0-eec1085bb796\") " pod="openshift-console/console-7d8db5cdb6-rcp5r" Apr 21 01:53:55.254941 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:53:55.254869 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/fd81e18c-aea5-40b0-aff0-eec1085bb796-trusted-ca-bundle\") pod \"console-7d8db5cdb6-rcp5r\" (UID: \"fd81e18c-aea5-40b0-aff0-eec1085bb796\") " pod="openshift-console/console-7d8db5cdb6-rcp5r" Apr 21 01:53:55.255558 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:53:55.255526 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fd81e18c-aea5-40b0-aff0-eec1085bb796-service-ca\") pod \"console-7d8db5cdb6-rcp5r\" (UID: \"fd81e18c-aea5-40b0-aff0-eec1085bb796\") " pod="openshift-console/console-7d8db5cdb6-rcp5r" Apr 21 01:53:55.255666 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:53:55.255564 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/fd81e18c-aea5-40b0-aff0-eec1085bb796-console-config\") pod \"console-7d8db5cdb6-rcp5r\" (UID: \"fd81e18c-aea5-40b0-aff0-eec1085bb796\") " pod="openshift-console/console-7d8db5cdb6-rcp5r" Apr 21 01:53:55.255666 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:53:55.255567 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/fd81e18c-aea5-40b0-aff0-eec1085bb796-oauth-serving-cert\") pod \"console-7d8db5cdb6-rcp5r\" (UID: \"fd81e18c-aea5-40b0-aff0-eec1085bb796\") " pod="openshift-console/console-7d8db5cdb6-rcp5r" Apr 21 01:53:55.255743 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:53:55.255663 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fd81e18c-aea5-40b0-aff0-eec1085bb796-trusted-ca-bundle\") pod \"console-7d8db5cdb6-rcp5r\" (UID: \"fd81e18c-aea5-40b0-aff0-eec1085bb796\") " pod="openshift-console/console-7d8db5cdb6-rcp5r" Apr 21 01:53:55.257759 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:53:55.257738 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" 
(UniqueName: \"kubernetes.io/secret/fd81e18c-aea5-40b0-aff0-eec1085bb796-console-serving-cert\") pod \"console-7d8db5cdb6-rcp5r\" (UID: \"fd81e18c-aea5-40b0-aff0-eec1085bb796\") " pod="openshift-console/console-7d8db5cdb6-rcp5r" Apr 21 01:53:55.257898 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:53:55.257877 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/fd81e18c-aea5-40b0-aff0-eec1085bb796-console-oauth-config\") pod \"console-7d8db5cdb6-rcp5r\" (UID: \"fd81e18c-aea5-40b0-aff0-eec1085bb796\") " pod="openshift-console/console-7d8db5cdb6-rcp5r" Apr 21 01:53:55.265481 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:53:55.265451 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-grm4k\" (UniqueName: \"kubernetes.io/projected/fd81e18c-aea5-40b0-aff0-eec1085bb796-kube-api-access-grm4k\") pod \"console-7d8db5cdb6-rcp5r\" (UID: \"fd81e18c-aea5-40b0-aff0-eec1085bb796\") " pod="openshift-console/console-7d8db5cdb6-rcp5r" Apr 21 01:53:55.368848 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:53:55.368762 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7d8db5cdb6-rcp5r" Apr 21 01:53:55.489952 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:53:55.489927 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7d8db5cdb6-rcp5r"] Apr 21 01:53:55.491776 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:53:55.491747 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfd81e18c_aea5_40b0_aff0_eec1085bb796.slice/crio-7464f4ff54dce42e5b2f5ae767013407aaa781aec2c9e28e7c08020ba1da5d3a WatchSource:0}: Error finding container 7464f4ff54dce42e5b2f5ae767013407aaa781aec2c9e28e7c08020ba1da5d3a: Status 404 returned error can't find the container with id 7464f4ff54dce42e5b2f5ae767013407aaa781aec2c9e28e7c08020ba1da5d3a Apr 21 01:53:55.906502 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:53:55.906460 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7d8db5cdb6-rcp5r" event={"ID":"fd81e18c-aea5-40b0-aff0-eec1085bb796","Type":"ContainerStarted","Data":"e1e30f6851812fe4c5d0ffea00284a69c3209c3645fa06c1914a02196c44b0df"} Apr 21 01:53:55.906502 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:53:55.906502 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7d8db5cdb6-rcp5r" event={"ID":"fd81e18c-aea5-40b0-aff0-eec1085bb796","Type":"ContainerStarted","Data":"7464f4ff54dce42e5b2f5ae767013407aaa781aec2c9e28e7c08020ba1da5d3a"} Apr 21 01:53:55.923594 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:53:55.923547 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7d8db5cdb6-rcp5r" podStartSLOduration=0.923531845 podStartE2EDuration="923.531845ms" podCreationTimestamp="2026-04-21 01:53:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 01:53:55.922993047 +0000 UTC m=+233.502295105" 
watchObservedRunningTime="2026-04-21 01:53:55.923531845 +0000 UTC m=+233.502833902" Apr 21 01:54:05.369141 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:54:05.369084 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-7d8db5cdb6-rcp5r" Apr 21 01:54:05.369141 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:54:05.369149 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7d8db5cdb6-rcp5r" Apr 21 01:54:05.373973 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:54:05.373949 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7d8db5cdb6-rcp5r" Apr 21 01:54:05.944177 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:54:05.944146 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7d8db5cdb6-rcp5r" Apr 21 01:54:05.987023 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:54:05.986984 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-57f87fb9f9-fcxnl"] Apr 21 01:54:19.971732 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:54:19.971657 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-6c4fcddbdb-bbk5k" podUID="22261168-6623-452a-8dca-8ac684c5d859" containerName="console" containerID="cri-o://b52d14bcde1d1d88c292efe8bdef2b58fa3a9c1fb9b26182399ca2062748869a" gracePeriod=15 Apr 21 01:54:20.212141 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:54:20.212116 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6c4fcddbdb-bbk5k_22261168-6623-452a-8dca-8ac684c5d859/console/0.log" Apr 21 01:54:20.212278 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:54:20.212186 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6c4fcddbdb-bbk5k" Apr 21 01:54:20.375616 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:54:20.375580 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/22261168-6623-452a-8dca-8ac684c5d859-oauth-serving-cert\") pod \"22261168-6623-452a-8dca-8ac684c5d859\" (UID: \"22261168-6623-452a-8dca-8ac684c5d859\") " Apr 21 01:54:20.375809 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:54:20.375667 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/22261168-6623-452a-8dca-8ac684c5d859-console-config\") pod \"22261168-6623-452a-8dca-8ac684c5d859\" (UID: \"22261168-6623-452a-8dca-8ac684c5d859\") " Apr 21 01:54:20.375920 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:54:20.375834 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/22261168-6623-452a-8dca-8ac684c5d859-console-oauth-config\") pod \"22261168-6623-452a-8dca-8ac684c5d859\" (UID: \"22261168-6623-452a-8dca-8ac684c5d859\") " Apr 21 01:54:20.375920 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:54:20.375881 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/22261168-6623-452a-8dca-8ac684c5d859-trusted-ca-bundle\") pod \"22261168-6623-452a-8dca-8ac684c5d859\" (UID: \"22261168-6623-452a-8dca-8ac684c5d859\") " Apr 21 01:54:20.375920 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:54:20.375901 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/22261168-6623-452a-8dca-8ac684c5d859-service-ca\") pod \"22261168-6623-452a-8dca-8ac684c5d859\" (UID: \"22261168-6623-452a-8dca-8ac684c5d859\") " Apr 21 01:54:20.376070 ip-10-0-129-52 
kubenswrapper[2573]: I0421 01:54:20.375927 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9bsmg\" (UniqueName: \"kubernetes.io/projected/22261168-6623-452a-8dca-8ac684c5d859-kube-api-access-9bsmg\") pod \"22261168-6623-452a-8dca-8ac684c5d859\" (UID: \"22261168-6623-452a-8dca-8ac684c5d859\") "
Apr 21 01:54:20.376070 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:54:20.375978 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/22261168-6623-452a-8dca-8ac684c5d859-console-serving-cert\") pod \"22261168-6623-452a-8dca-8ac684c5d859\" (UID: \"22261168-6623-452a-8dca-8ac684c5d859\") "
Apr 21 01:54:20.376257 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:54:20.376214 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22261168-6623-452a-8dca-8ac684c5d859-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "22261168-6623-452a-8dca-8ac684c5d859" (UID: "22261168-6623-452a-8dca-8ac684c5d859"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 01:54:20.376257 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:54:20.376220 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22261168-6623-452a-8dca-8ac684c5d859-console-config" (OuterVolumeSpecName: "console-config") pod "22261168-6623-452a-8dca-8ac684c5d859" (UID: "22261168-6623-452a-8dca-8ac684c5d859"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 01:54:20.376434 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:54:20.376277 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22261168-6623-452a-8dca-8ac684c5d859-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "22261168-6623-452a-8dca-8ac684c5d859" (UID: "22261168-6623-452a-8dca-8ac684c5d859"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 01:54:20.376434 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:54:20.376322 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22261168-6623-452a-8dca-8ac684c5d859-service-ca" (OuterVolumeSpecName: "service-ca") pod "22261168-6623-452a-8dca-8ac684c5d859" (UID: "22261168-6623-452a-8dca-8ac684c5d859"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 01:54:20.378082 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:54:20.378054 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22261168-6623-452a-8dca-8ac684c5d859-kube-api-access-9bsmg" (OuterVolumeSpecName: "kube-api-access-9bsmg") pod "22261168-6623-452a-8dca-8ac684c5d859" (UID: "22261168-6623-452a-8dca-8ac684c5d859"). InnerVolumeSpecName "kube-api-access-9bsmg". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 01:54:20.378082 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:54:20.378077 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22261168-6623-452a-8dca-8ac684c5d859-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "22261168-6623-452a-8dca-8ac684c5d859" (UID: "22261168-6623-452a-8dca-8ac684c5d859"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 21 01:54:20.378205 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:54:20.378186 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22261168-6623-452a-8dca-8ac684c5d859-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "22261168-6623-452a-8dca-8ac684c5d859" (UID: "22261168-6623-452a-8dca-8ac684c5d859"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 21 01:54:20.477254 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:54:20.477215 2573 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/22261168-6623-452a-8dca-8ac684c5d859-console-config\") on node \"ip-10-0-129-52.ec2.internal\" DevicePath \"\""
Apr 21 01:54:20.477254 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:54:20.477245 2573 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/22261168-6623-452a-8dca-8ac684c5d859-console-oauth-config\") on node \"ip-10-0-129-52.ec2.internal\" DevicePath \"\""
Apr 21 01:54:20.477254 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:54:20.477255 2573 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/22261168-6623-452a-8dca-8ac684c5d859-trusted-ca-bundle\") on node \"ip-10-0-129-52.ec2.internal\" DevicePath \"\""
Apr 21 01:54:20.477254 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:54:20.477264 2573 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/22261168-6623-452a-8dca-8ac684c5d859-service-ca\") on node \"ip-10-0-129-52.ec2.internal\" DevicePath \"\""
Apr 21 01:54:20.477508 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:54:20.477274 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9bsmg\" (UniqueName: \"kubernetes.io/projected/22261168-6623-452a-8dca-8ac684c5d859-kube-api-access-9bsmg\") on node \"ip-10-0-129-52.ec2.internal\" DevicePath \"\""
Apr 21 01:54:20.477508 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:54:20.477283 2573 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/22261168-6623-452a-8dca-8ac684c5d859-console-serving-cert\") on node \"ip-10-0-129-52.ec2.internal\" DevicePath \"\""
Apr 21 01:54:20.477508 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:54:20.477293 2573 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/22261168-6623-452a-8dca-8ac684c5d859-oauth-serving-cert\") on node \"ip-10-0-129-52.ec2.internal\" DevicePath \"\""
Apr 21 01:54:20.987470 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:54:20.987440 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6c4fcddbdb-bbk5k_22261168-6623-452a-8dca-8ac684c5d859/console/0.log"
Apr 21 01:54:20.987913 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:54:20.987480 2573 generic.go:358] "Generic (PLEG): container finished" podID="22261168-6623-452a-8dca-8ac684c5d859" containerID="b52d14bcde1d1d88c292efe8bdef2b58fa3a9c1fb9b26182399ca2062748869a" exitCode=2
Apr 21 01:54:20.987913 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:54:20.987519 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6c4fcddbdb-bbk5k" event={"ID":"22261168-6623-452a-8dca-8ac684c5d859","Type":"ContainerDied","Data":"b52d14bcde1d1d88c292efe8bdef2b58fa3a9c1fb9b26182399ca2062748869a"}
Apr 21 01:54:20.987913 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:54:20.987558 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6c4fcddbdb-bbk5k"
Apr 21 01:54:20.987913 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:54:20.987567 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6c4fcddbdb-bbk5k" event={"ID":"22261168-6623-452a-8dca-8ac684c5d859","Type":"ContainerDied","Data":"b1795b95fae6fc0ac4465182a70a7aa464fb3b9fd431c673186b78d6005562b0"}
Apr 21 01:54:20.987913 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:54:20.987588 2573 scope.go:117] "RemoveContainer" containerID="b52d14bcde1d1d88c292efe8bdef2b58fa3a9c1fb9b26182399ca2062748869a"
Apr 21 01:54:21.001272 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:54:21.001243 2573 scope.go:117] "RemoveContainer" containerID="b52d14bcde1d1d88c292efe8bdef2b58fa3a9c1fb9b26182399ca2062748869a"
Apr 21 01:54:21.001642 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:54:21.001617 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b52d14bcde1d1d88c292efe8bdef2b58fa3a9c1fb9b26182399ca2062748869a\": container with ID starting with b52d14bcde1d1d88c292efe8bdef2b58fa3a9c1fb9b26182399ca2062748869a not found: ID does not exist" containerID="b52d14bcde1d1d88c292efe8bdef2b58fa3a9c1fb9b26182399ca2062748869a"
Apr 21 01:54:21.001733 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:54:21.001650 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b52d14bcde1d1d88c292efe8bdef2b58fa3a9c1fb9b26182399ca2062748869a"} err="failed to get container status \"b52d14bcde1d1d88c292efe8bdef2b58fa3a9c1fb9b26182399ca2062748869a\": rpc error: code = NotFound desc = could not find container \"b52d14bcde1d1d88c292efe8bdef2b58fa3a9c1fb9b26182399ca2062748869a\": container with ID starting with b52d14bcde1d1d88c292efe8bdef2b58fa3a9c1fb9b26182399ca2062748869a not found: ID does not exist"
Apr 21 01:54:21.013584 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:54:21.013549 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6c4fcddbdb-bbk5k"]
Apr 21 01:54:21.016604 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:54:21.016581 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-6c4fcddbdb-bbk5k"]
Apr 21 01:54:21.064714 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:54:21.064677 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22261168-6623-452a-8dca-8ac684c5d859" path="/var/lib/kubelet/pods/22261168-6623-452a-8dca-8ac684c5d859/volumes"
Apr 21 01:54:31.008017 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:54:31.007947 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-57f87fb9f9-fcxnl" podUID="f3818c9e-4bec-4a37-81d6-893c9e3a6a32" containerName="console" containerID="cri-o://963f3e84c30e67814d0b77a68197f84f96a3c0a536d5c8b96639ea2a306b4446" gracePeriod=15
Apr 21 01:54:31.248508 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:54:31.248484 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-57f87fb9f9-fcxnl_f3818c9e-4bec-4a37-81d6-893c9e3a6a32/console/0.log"
Apr 21 01:54:31.248639 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:54:31.248544 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-57f87fb9f9-fcxnl"
Apr 21 01:54:31.267076 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:54:31.266997 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f3818c9e-4bec-4a37-81d6-893c9e3a6a32-console-serving-cert\") pod \"f3818c9e-4bec-4a37-81d6-893c9e3a6a32\" (UID: \"f3818c9e-4bec-4a37-81d6-893c9e3a6a32\") "
Apr 21 01:54:31.267076 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:54:31.267045 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f3818c9e-4bec-4a37-81d6-893c9e3a6a32-service-ca\") pod \"f3818c9e-4bec-4a37-81d6-893c9e3a6a32\" (UID: \"f3818c9e-4bec-4a37-81d6-893c9e3a6a32\") "
Apr 21 01:54:31.267076 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:54:31.267063 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7gvg8\" (UniqueName: \"kubernetes.io/projected/f3818c9e-4bec-4a37-81d6-893c9e3a6a32-kube-api-access-7gvg8\") pod \"f3818c9e-4bec-4a37-81d6-893c9e3a6a32\" (UID: \"f3818c9e-4bec-4a37-81d6-893c9e3a6a32\") "
Apr 21 01:54:31.267316 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:54:31.267119 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f3818c9e-4bec-4a37-81d6-893c9e3a6a32-trusted-ca-bundle\") pod \"f3818c9e-4bec-4a37-81d6-893c9e3a6a32\" (UID: \"f3818c9e-4bec-4a37-81d6-893c9e3a6a32\") "
Apr 21 01:54:31.267316 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:54:31.267135 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f3818c9e-4bec-4a37-81d6-893c9e3a6a32-console-oauth-config\") pod \"f3818c9e-4bec-4a37-81d6-893c9e3a6a32\" (UID: \"f3818c9e-4bec-4a37-81d6-893c9e3a6a32\") "
Apr 21 01:54:31.267316 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:54:31.267183 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f3818c9e-4bec-4a37-81d6-893c9e3a6a32-oauth-serving-cert\") pod \"f3818c9e-4bec-4a37-81d6-893c9e3a6a32\" (UID: \"f3818c9e-4bec-4a37-81d6-893c9e3a6a32\") "
Apr 21 01:54:31.267316 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:54:31.267226 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f3818c9e-4bec-4a37-81d6-893c9e3a6a32-console-config\") pod \"f3818c9e-4bec-4a37-81d6-893c9e3a6a32\" (UID: \"f3818c9e-4bec-4a37-81d6-893c9e3a6a32\") "
Apr 21 01:54:31.267550 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:54:31.267523 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f3818c9e-4bec-4a37-81d6-893c9e3a6a32-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "f3818c9e-4bec-4a37-81d6-893c9e3a6a32" (UID: "f3818c9e-4bec-4a37-81d6-893c9e3a6a32"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 01:54:31.267804 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:54:31.267766 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f3818c9e-4bec-4a37-81d6-893c9e3a6a32-console-config" (OuterVolumeSpecName: "console-config") pod "f3818c9e-4bec-4a37-81d6-893c9e3a6a32" (UID: "f3818c9e-4bec-4a37-81d6-893c9e3a6a32"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 01:54:31.268038 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:54:31.267792 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f3818c9e-4bec-4a37-81d6-893c9e3a6a32-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "f3818c9e-4bec-4a37-81d6-893c9e3a6a32" (UID: "f3818c9e-4bec-4a37-81d6-893c9e3a6a32"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 01:54:31.268112 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:54:31.268040 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f3818c9e-4bec-4a37-81d6-893c9e3a6a32-service-ca" (OuterVolumeSpecName: "service-ca") pod "f3818c9e-4bec-4a37-81d6-893c9e3a6a32" (UID: "f3818c9e-4bec-4a37-81d6-893c9e3a6a32"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 01:54:31.269934 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:54:31.269886 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3818c9e-4bec-4a37-81d6-893c9e3a6a32-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "f3818c9e-4bec-4a37-81d6-893c9e3a6a32" (UID: "f3818c9e-4bec-4a37-81d6-893c9e3a6a32"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 21 01:54:31.269934 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:54:31.269905 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3818c9e-4bec-4a37-81d6-893c9e3a6a32-kube-api-access-7gvg8" (OuterVolumeSpecName: "kube-api-access-7gvg8") pod "f3818c9e-4bec-4a37-81d6-893c9e3a6a32" (UID: "f3818c9e-4bec-4a37-81d6-893c9e3a6a32"). InnerVolumeSpecName "kube-api-access-7gvg8". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 01:54:31.270206 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:54:31.270182 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3818c9e-4bec-4a37-81d6-893c9e3a6a32-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "f3818c9e-4bec-4a37-81d6-893c9e3a6a32" (UID: "f3818c9e-4bec-4a37-81d6-893c9e3a6a32"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 21 01:54:31.369074 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:54:31.369036 2573 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f3818c9e-4bec-4a37-81d6-893c9e3a6a32-console-config\") on node \"ip-10-0-129-52.ec2.internal\" DevicePath \"\""
Apr 21 01:54:31.369074 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:54:31.369072 2573 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f3818c9e-4bec-4a37-81d6-893c9e3a6a32-console-serving-cert\") on node \"ip-10-0-129-52.ec2.internal\" DevicePath \"\""
Apr 21 01:54:31.369074 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:54:31.369085 2573 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f3818c9e-4bec-4a37-81d6-893c9e3a6a32-service-ca\") on node \"ip-10-0-129-52.ec2.internal\" DevicePath \"\""
Apr 21 01:54:31.369385 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:54:31.369095 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7gvg8\" (UniqueName: \"kubernetes.io/projected/f3818c9e-4bec-4a37-81d6-893c9e3a6a32-kube-api-access-7gvg8\") on node \"ip-10-0-129-52.ec2.internal\" DevicePath \"\""
Apr 21 01:54:31.369385 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:54:31.369103 2573 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f3818c9e-4bec-4a37-81d6-893c9e3a6a32-trusted-ca-bundle\") on node \"ip-10-0-129-52.ec2.internal\" DevicePath \"\""
Apr 21 01:54:31.369385 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:54:31.369112 2573 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f3818c9e-4bec-4a37-81d6-893c9e3a6a32-console-oauth-config\") on node \"ip-10-0-129-52.ec2.internal\" DevicePath \"\""
Apr 21 01:54:31.369385 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:54:31.369121 2573 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f3818c9e-4bec-4a37-81d6-893c9e3a6a32-oauth-serving-cert\") on node \"ip-10-0-129-52.ec2.internal\" DevicePath \"\""
Apr 21 01:54:32.022463 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:54:32.022436 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-57f87fb9f9-fcxnl_f3818c9e-4bec-4a37-81d6-893c9e3a6a32/console/0.log"
Apr 21 01:54:32.022863 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:54:32.022475 2573 generic.go:358] "Generic (PLEG): container finished" podID="f3818c9e-4bec-4a37-81d6-893c9e3a6a32" containerID="963f3e84c30e67814d0b77a68197f84f96a3c0a536d5c8b96639ea2a306b4446" exitCode=2
Apr 21 01:54:32.022863 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:54:32.022541 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-57f87fb9f9-fcxnl"
Apr 21 01:54:32.022863 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:54:32.022560 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-57f87fb9f9-fcxnl" event={"ID":"f3818c9e-4bec-4a37-81d6-893c9e3a6a32","Type":"ContainerDied","Data":"963f3e84c30e67814d0b77a68197f84f96a3c0a536d5c8b96639ea2a306b4446"}
Apr 21 01:54:32.022863 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:54:32.022596 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-57f87fb9f9-fcxnl" event={"ID":"f3818c9e-4bec-4a37-81d6-893c9e3a6a32","Type":"ContainerDied","Data":"2ce3ca076fca5d94cec0a8c57e8a4a9766537d522bfc644fd7e74114681d8d38"}
Apr 21 01:54:32.022863 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:54:32.022611 2573 scope.go:117] "RemoveContainer" containerID="963f3e84c30e67814d0b77a68197f84f96a3c0a536d5c8b96639ea2a306b4446"
Apr 21 01:54:32.031199 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:54:32.031180 2573 scope.go:117] "RemoveContainer" containerID="963f3e84c30e67814d0b77a68197f84f96a3c0a536d5c8b96639ea2a306b4446"
Apr 21 01:54:32.031471 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:54:32.031450 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"963f3e84c30e67814d0b77a68197f84f96a3c0a536d5c8b96639ea2a306b4446\": container with ID starting with 963f3e84c30e67814d0b77a68197f84f96a3c0a536d5c8b96639ea2a306b4446 not found: ID does not exist" containerID="963f3e84c30e67814d0b77a68197f84f96a3c0a536d5c8b96639ea2a306b4446"
Apr 21 01:54:32.031510 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:54:32.031481 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"963f3e84c30e67814d0b77a68197f84f96a3c0a536d5c8b96639ea2a306b4446"} err="failed to get container status \"963f3e84c30e67814d0b77a68197f84f96a3c0a536d5c8b96639ea2a306b4446\": rpc error: code = NotFound desc = could not find container \"963f3e84c30e67814d0b77a68197f84f96a3c0a536d5c8b96639ea2a306b4446\": container with ID starting with 963f3e84c30e67814d0b77a68197f84f96a3c0a536d5c8b96639ea2a306b4446 not found: ID does not exist"
Apr 21 01:54:32.043008 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:54:32.042982 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-57f87fb9f9-fcxnl"]
Apr 21 01:54:32.046730 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:54:32.046708 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-57f87fb9f9-fcxnl"]
Apr 21 01:54:33.063105 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:54:33.063075 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3818c9e-4bec-4a37-81d6-893c9e3a6a32" path="/var/lib/kubelet/pods/f3818c9e-4bec-4a37-81d6-893c9e3a6a32/volumes"
Apr 21 01:55:02.948574 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:02.948547 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f2tdx_ed377958-ce5b-41c7-9512-4b95b799767d/ovn-acl-logging/0.log"
Apr 21 01:55:02.949183 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:02.949035 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f2tdx_ed377958-ce5b-41c7-9512-4b95b799767d/ovn-acl-logging/0.log"
Apr 21 01:55:02.955049 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:02.955023 2573 kubelet.go:1628] "Image garbage collection succeeded"
Apr 21 01:55:05.990327 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:05.990291 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dn9n9d"]
Apr 21 01:55:05.992764 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:05.990780 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="22261168-6623-452a-8dca-8ac684c5d859" containerName="console"
Apr 21 01:55:05.992764 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:05.990796 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="22261168-6623-452a-8dca-8ac684c5d859" containerName="console"
Apr 21 01:55:05.992764 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:05.990808 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f3818c9e-4bec-4a37-81d6-893c9e3a6a32" containerName="console"
Apr 21 01:55:05.992764 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:05.990832 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3818c9e-4bec-4a37-81d6-893c9e3a6a32" containerName="console"
Apr 21 01:55:05.992764 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:05.990897 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="f3818c9e-4bec-4a37-81d6-893c9e3a6a32" containerName="console"
Apr 21 01:55:05.992764 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:05.990908 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="22261168-6623-452a-8dca-8ac684c5d859" containerName="console"
Apr 21 01:55:05.993797 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:05.993778 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dn9n9d"
Apr 21 01:55:05.996275 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:05.996248 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-gj698\""
Apr 21 01:55:05.996275 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:05.996271 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 21 01:55:05.997259 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:05.997243 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 21 01:55:06.001727 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:06.001703 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dn9n9d"]
Apr 21 01:55:06.173106 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:06.173069 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1c6eb14e-72ee-4dcc-8a78-ab8f956ee564-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dn9n9d\" (UID: \"1c6eb14e-72ee-4dcc-8a78-ab8f956ee564\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dn9n9d"
Apr 21 01:55:06.173306 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:06.173128 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzbwh\" (UniqueName: \"kubernetes.io/projected/1c6eb14e-72ee-4dcc-8a78-ab8f956ee564-kube-api-access-tzbwh\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dn9n9d\" (UID: \"1c6eb14e-72ee-4dcc-8a78-ab8f956ee564\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dn9n9d"
Apr 21 01:55:06.173306 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:06.173231 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1c6eb14e-72ee-4dcc-8a78-ab8f956ee564-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dn9n9d\" (UID: \"1c6eb14e-72ee-4dcc-8a78-ab8f956ee564\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dn9n9d"
Apr 21 01:55:06.274494 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:06.274387 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1c6eb14e-72ee-4dcc-8a78-ab8f956ee564-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dn9n9d\" (UID: \"1c6eb14e-72ee-4dcc-8a78-ab8f956ee564\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dn9n9d"
Apr 21 01:55:06.274494 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:06.274473 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tzbwh\" (UniqueName: \"kubernetes.io/projected/1c6eb14e-72ee-4dcc-8a78-ab8f956ee564-kube-api-access-tzbwh\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dn9n9d\" (UID: \"1c6eb14e-72ee-4dcc-8a78-ab8f956ee564\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dn9n9d"
Apr 21 01:55:06.274713 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:06.274600 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1c6eb14e-72ee-4dcc-8a78-ab8f956ee564-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dn9n9d\" (UID: \"1c6eb14e-72ee-4dcc-8a78-ab8f956ee564\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dn9n9d"
Apr 21 01:55:06.274771 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:06.274752 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1c6eb14e-72ee-4dcc-8a78-ab8f956ee564-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dn9n9d\" (UID: \"1c6eb14e-72ee-4dcc-8a78-ab8f956ee564\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dn9n9d"
Apr 21 01:55:06.274941 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:06.274925 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1c6eb14e-72ee-4dcc-8a78-ab8f956ee564-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dn9n9d\" (UID: \"1c6eb14e-72ee-4dcc-8a78-ab8f956ee564\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dn9n9d"
Apr 21 01:55:06.284239 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:06.284211 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzbwh\" (UniqueName: \"kubernetes.io/projected/1c6eb14e-72ee-4dcc-8a78-ab8f956ee564-kube-api-access-tzbwh\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dn9n9d\" (UID: \"1c6eb14e-72ee-4dcc-8a78-ab8f956ee564\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dn9n9d"
Apr 21 01:55:06.303917 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:06.303882 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dn9n9d"
Apr 21 01:55:06.425506 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:06.425481 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dn9n9d"]
Apr 21 01:55:06.427617 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:55:06.427589 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1c6eb14e_72ee_4dcc_8a78_ab8f956ee564.slice/crio-8ff7b3fbbf835eb483937417b6c903a1c7fa0a2f3dc15c602f7525d1778042a8 WatchSource:0}: Error finding container 8ff7b3fbbf835eb483937417b6c903a1c7fa0a2f3dc15c602f7525d1778042a8: Status 404 returned error can't find the container with id 8ff7b3fbbf835eb483937417b6c903a1c7fa0a2f3dc15c602f7525d1778042a8
Apr 21 01:55:06.429403 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:06.429383 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 21 01:55:07.124178 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:07.124142 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dn9n9d" event={"ID":"1c6eb14e-72ee-4dcc-8a78-ab8f956ee564","Type":"ContainerStarted","Data":"8ff7b3fbbf835eb483937417b6c903a1c7fa0a2f3dc15c602f7525d1778042a8"}
Apr 21 01:55:12.143046 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:12.143012 2573 generic.go:358] "Generic (PLEG): container finished" podID="1c6eb14e-72ee-4dcc-8a78-ab8f956ee564" containerID="82bf3155dd556c379acda2e3b84b5887e0af6f2e624669d7e0b207cd0d32659e" exitCode=0
Apr 21 01:55:12.143428 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:12.143062 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dn9n9d" event={"ID":"1c6eb14e-72ee-4dcc-8a78-ab8f956ee564","Type":"ContainerDied","Data":"82bf3155dd556c379acda2e3b84b5887e0af6f2e624669d7e0b207cd0d32659e"}
Apr 21 01:55:14.150511 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:14.150480 2573 generic.go:358] "Generic (PLEG): container finished" podID="1c6eb14e-72ee-4dcc-8a78-ab8f956ee564" containerID="b1948f500b4b5d86c3e55350002edd4155dade2309a8d753b48edc6faf700b0a" exitCode=0
Apr 21 01:55:14.151006 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:14.150570 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dn9n9d" event={"ID":"1c6eb14e-72ee-4dcc-8a78-ab8f956ee564","Type":"ContainerDied","Data":"b1948f500b4b5d86c3e55350002edd4155dade2309a8d753b48edc6faf700b0a"}
Apr 21 01:55:20.176137 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:20.176104 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dn9n9d" event={"ID":"1c6eb14e-72ee-4dcc-8a78-ab8f956ee564","Type":"ContainerStarted","Data":"559eb79e912de1143e646d357c7fa1bb67749d5bf949e26d58b0951b473907f6"}
Apr 21 01:55:20.191993 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:20.191943 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dn9n9d" podStartSLOduration=1.532522674 podStartE2EDuration="15.19192694s" podCreationTimestamp="2026-04-21 01:55:05 +0000 UTC" firstStartedPulling="2026-04-21 01:55:06.429511735 +0000 UTC m=+304.008813771" lastFinishedPulling="2026-04-21 01:55:20.088915982 +0000 UTC m=+317.668218037" observedRunningTime="2026-04-21 01:55:20.190660659 +0000 UTC m=+317.769962718" watchObservedRunningTime="2026-04-21 01:55:20.19192694 +0000 UTC m=+317.771229009"
Apr 21 01:55:21.181499 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:21.181451 2573 generic.go:358] "Generic (PLEG): container finished" podID="1c6eb14e-72ee-4dcc-8a78-ab8f956ee564" containerID="559eb79e912de1143e646d357c7fa1bb67749d5bf949e26d58b0951b473907f6" exitCode=0
Apr 21 01:55:21.181956 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:21.181520 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dn9n9d" event={"ID":"1c6eb14e-72ee-4dcc-8a78-ab8f956ee564","Type":"ContainerDied","Data":"559eb79e912de1143e646d357c7fa1bb67749d5bf949e26d58b0951b473907f6"}
Apr 21 01:55:22.302591 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:22.302565 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dn9n9d"
Apr 21 01:55:22.418664 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:22.418624 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1c6eb14e-72ee-4dcc-8a78-ab8f956ee564-util\") pod \"1c6eb14e-72ee-4dcc-8a78-ab8f956ee564\" (UID: \"1c6eb14e-72ee-4dcc-8a78-ab8f956ee564\") "
Apr 21 01:55:22.418871 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:22.418697 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tzbwh\" (UniqueName: \"kubernetes.io/projected/1c6eb14e-72ee-4dcc-8a78-ab8f956ee564-kube-api-access-tzbwh\") pod \"1c6eb14e-72ee-4dcc-8a78-ab8f956ee564\" (UID: \"1c6eb14e-72ee-4dcc-8a78-ab8f956ee564\") "
Apr 21 01:55:22.418871 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:22.418731 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1c6eb14e-72ee-4dcc-8a78-ab8f956ee564-bundle\") pod \"1c6eb14e-72ee-4dcc-8a78-ab8f956ee564\" (UID: \"1c6eb14e-72ee-4dcc-8a78-ab8f956ee564\") "
Apr 21 01:55:22.419442 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:22.419413 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c6eb14e-72ee-4dcc-8a78-ab8f956ee564-bundle" (OuterVolumeSpecName: "bundle") pod "1c6eb14e-72ee-4dcc-8a78-ab8f956ee564" (UID: "1c6eb14e-72ee-4dcc-8a78-ab8f956ee564"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 01:55:22.421002 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:22.420965 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c6eb14e-72ee-4dcc-8a78-ab8f956ee564-kube-api-access-tzbwh" (OuterVolumeSpecName: "kube-api-access-tzbwh") pod "1c6eb14e-72ee-4dcc-8a78-ab8f956ee564" (UID: "1c6eb14e-72ee-4dcc-8a78-ab8f956ee564"). InnerVolumeSpecName "kube-api-access-tzbwh". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 01:55:22.422854 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:22.422831 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c6eb14e-72ee-4dcc-8a78-ab8f956ee564-util" (OuterVolumeSpecName: "util") pod "1c6eb14e-72ee-4dcc-8a78-ab8f956ee564" (UID: "1c6eb14e-72ee-4dcc-8a78-ab8f956ee564"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 01:55:22.520188 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:22.520098 2573 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1c6eb14e-72ee-4dcc-8a78-ab8f956ee564-util\") on node \"ip-10-0-129-52.ec2.internal\" DevicePath \"\""
Apr 21 01:55:22.520188 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:22.520130 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tzbwh\" (UniqueName: \"kubernetes.io/projected/1c6eb14e-72ee-4dcc-8a78-ab8f956ee564-kube-api-access-tzbwh\") on node \"ip-10-0-129-52.ec2.internal\" DevicePath \"\""
Apr 21 01:55:22.520188 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:22.520141 2573 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1c6eb14e-72ee-4dcc-8a78-ab8f956ee564-bundle\") on node \"ip-10-0-129-52.ec2.internal\" DevicePath \"\""
Apr 21 01:55:23.188499 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:23.188471 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dn9n9d" Apr 21 01:55:23.188644 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:23.188470 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dn9n9d" event={"ID":"1c6eb14e-72ee-4dcc-8a78-ab8f956ee564","Type":"ContainerDied","Data":"8ff7b3fbbf835eb483937417b6c903a1c7fa0a2f3dc15c602f7525d1778042a8"} Apr 21 01:55:23.188644 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:23.188573 2573 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8ff7b3fbbf835eb483937417b6c903a1c7fa0a2f3dc15c602f7525d1778042a8" Apr 21 01:55:28.867488 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:28.867455 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-wpl6x"] Apr 21 01:55:28.867902 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:28.867804 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1c6eb14e-72ee-4dcc-8a78-ab8f956ee564" containerName="extract" Apr 21 01:55:28.867902 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:28.867830 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c6eb14e-72ee-4dcc-8a78-ab8f956ee564" containerName="extract" Apr 21 01:55:28.867902 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:28.867843 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1c6eb14e-72ee-4dcc-8a78-ab8f956ee564" containerName="pull" Apr 21 01:55:28.867902 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:28.867848 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c6eb14e-72ee-4dcc-8a78-ab8f956ee564" containerName="pull" Apr 21 01:55:28.867902 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:28.867873 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="1c6eb14e-72ee-4dcc-8a78-ab8f956ee564" containerName="util" Apr 21 01:55:28.867902 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:28.867879 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c6eb14e-72ee-4dcc-8a78-ab8f956ee564" containerName="util" Apr 21 01:55:28.868093 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:28.867954 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="1c6eb14e-72ee-4dcc-8a78-ab8f956ee564" containerName="extract" Apr 21 01:55:28.905522 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:28.905484 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-wpl6x"] Apr 21 01:55:28.905674 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:28.905635 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-wpl6x" Apr 21 01:55:28.909029 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:28.909001 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager-operator\"/\"cert-manager-operator-controller-manager-dockercfg-l82hs\"" Apr 21 01:55:28.909162 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:28.909119 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"kube-root-ca.crt\"" Apr 21 01:55:28.909162 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:28.909134 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"openshift-service-ca.crt\"" Apr 21 01:55:28.978300 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:28.978258 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55ktl\" (UniqueName: \"kubernetes.io/projected/30546a95-93c7-44e0-8efe-1f06819792e2-kube-api-access-55ktl\") pod \"cert-manager-operator-controller-manager-54b9655956-wpl6x\" (UID: 
\"30546a95-93c7-44e0-8efe-1f06819792e2\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-wpl6x" Apr 21 01:55:28.978487 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:28.978330 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/30546a95-93c7-44e0-8efe-1f06819792e2-tmp\") pod \"cert-manager-operator-controller-manager-54b9655956-wpl6x\" (UID: \"30546a95-93c7-44e0-8efe-1f06819792e2\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-wpl6x" Apr 21 01:55:29.079439 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:29.079412 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/30546a95-93c7-44e0-8efe-1f06819792e2-tmp\") pod \"cert-manager-operator-controller-manager-54b9655956-wpl6x\" (UID: \"30546a95-93c7-44e0-8efe-1f06819792e2\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-wpl6x" Apr 21 01:55:29.079585 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:29.079467 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-55ktl\" (UniqueName: \"kubernetes.io/projected/30546a95-93c7-44e0-8efe-1f06819792e2-kube-api-access-55ktl\") pod \"cert-manager-operator-controller-manager-54b9655956-wpl6x\" (UID: \"30546a95-93c7-44e0-8efe-1f06819792e2\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-wpl6x" Apr 21 01:55:29.079801 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:29.079782 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/30546a95-93c7-44e0-8efe-1f06819792e2-tmp\") pod \"cert-manager-operator-controller-manager-54b9655956-wpl6x\" (UID: \"30546a95-93c7-44e0-8efe-1f06819792e2\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-wpl6x" 
Apr 21 01:55:29.086985 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:29.086962 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-55ktl\" (UniqueName: \"kubernetes.io/projected/30546a95-93c7-44e0-8efe-1f06819792e2-kube-api-access-55ktl\") pod \"cert-manager-operator-controller-manager-54b9655956-wpl6x\" (UID: \"30546a95-93c7-44e0-8efe-1f06819792e2\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-wpl6x"
Apr 21 01:55:29.214593 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:29.214560 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-wpl6x"
Apr 21 01:55:29.340479 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:29.340445 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-wpl6x"]
Apr 21 01:55:29.354220 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:55:29.354176 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod30546a95_93c7_44e0_8efe_1f06819792e2.slice/crio-3ba69a32e80e39e8173ea1f1a6073162743a39507febc0c1140034526770e22f WatchSource:0}: Error finding container 3ba69a32e80e39e8173ea1f1a6073162743a39507febc0c1140034526770e22f: Status 404 returned error can't find the container with id 3ba69a32e80e39e8173ea1f1a6073162743a39507febc0c1140034526770e22f
Apr 21 01:55:30.210102 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:30.210061 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-wpl6x" event={"ID":"30546a95-93c7-44e0-8efe-1f06819792e2","Type":"ContainerStarted","Data":"3ba69a32e80e39e8173ea1f1a6073162743a39507febc0c1140034526770e22f"}
Apr 21 01:55:32.218992 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:32.218960 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-wpl6x" event={"ID":"30546a95-93c7-44e0-8efe-1f06819792e2","Type":"ContainerStarted","Data":"6b1308c5f87d52f9d7edc34012f3678c49081b05a5cd8860192fb57a03c53152"}
Apr 21 01:55:32.247911 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:32.247838 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-wpl6x" podStartSLOduration=2.029976005 podStartE2EDuration="4.247794247s" podCreationTimestamp="2026-04-21 01:55:28 +0000 UTC" firstStartedPulling="2026-04-21 01:55:29.356690668 +0000 UTC m=+326.935992704" lastFinishedPulling="2026-04-21 01:55:31.574508907 +0000 UTC m=+329.153810946" observedRunningTime="2026-04-21 01:55:32.244605602 +0000 UTC m=+329.823907652" watchObservedRunningTime="2026-04-21 01:55:32.247794247 +0000 UTC m=+329.827096306"
Apr 21 01:55:33.212258 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:33.212220 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4frdw"]
Apr 21 01:55:33.237417 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:33.237385 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4frdw"]
Apr 21 01:55:33.237793 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:33.237653 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4frdw"
Apr 21 01:55:33.240551 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:33.240529 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-gj698\""
Apr 21 01:55:33.240703 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:33.240605 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 21 01:55:33.241478 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:33.241464 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 21 01:55:33.317666 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:33.317626 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gggm\" (UniqueName: \"kubernetes.io/projected/b92e0bbc-a17e-4900-a957-ac2c523165b1-kube-api-access-4gggm\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4frdw\" (UID: \"b92e0bbc-a17e-4900-a957-ac2c523165b1\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4frdw"
Apr 21 01:55:33.317896 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:33.317799 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b92e0bbc-a17e-4900-a957-ac2c523165b1-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4frdw\" (UID: \"b92e0bbc-a17e-4900-a957-ac2c523165b1\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4frdw"
Apr 21 01:55:33.317896 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:33.317875 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b92e0bbc-a17e-4900-a957-ac2c523165b1-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4frdw\" (UID: \"b92e0bbc-a17e-4900-a957-ac2c523165b1\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4frdw"
Apr 21 01:55:33.418351 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:33.418311 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b92e0bbc-a17e-4900-a957-ac2c523165b1-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4frdw\" (UID: \"b92e0bbc-a17e-4900-a957-ac2c523165b1\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4frdw"
Apr 21 01:55:33.418351 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:33.418358 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b92e0bbc-a17e-4900-a957-ac2c523165b1-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4frdw\" (UID: \"b92e0bbc-a17e-4900-a957-ac2c523165b1\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4frdw"
Apr 21 01:55:33.418558 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:33.418389 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4gggm\" (UniqueName: \"kubernetes.io/projected/b92e0bbc-a17e-4900-a957-ac2c523165b1-kube-api-access-4gggm\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4frdw\" (UID: \"b92e0bbc-a17e-4900-a957-ac2c523165b1\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4frdw"
Apr 21 01:55:33.418719 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:33.418699 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b92e0bbc-a17e-4900-a957-ac2c523165b1-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4frdw\" (UID: \"b92e0bbc-a17e-4900-a957-ac2c523165b1\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4frdw"
Apr 21 01:55:33.418764 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:33.418746 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b92e0bbc-a17e-4900-a957-ac2c523165b1-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4frdw\" (UID: \"b92e0bbc-a17e-4900-a957-ac2c523165b1\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4frdw"
Apr 21 01:55:33.426588 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:33.426562 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4gggm\" (UniqueName: \"kubernetes.io/projected/b92e0bbc-a17e-4900-a957-ac2c523165b1-kube-api-access-4gggm\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4frdw\" (UID: \"b92e0bbc-a17e-4900-a957-ac2c523165b1\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4frdw"
Apr 21 01:55:33.547168 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:33.547077 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4frdw"
Apr 21 01:55:33.674255 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:33.674229 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4frdw"]
Apr 21 01:55:33.676785 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:55:33.676762 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb92e0bbc_a17e_4900_a957_ac2c523165b1.slice/crio-95c0f3acc7204f8919bc99f448d89da0d3d811c63381e978123af8038acd9269 WatchSource:0}: Error finding container 95c0f3acc7204f8919bc99f448d89da0d3d811c63381e978123af8038acd9269: Status 404 returned error can't find the container with id 95c0f3acc7204f8919bc99f448d89da0d3d811c63381e978123af8038acd9269
Apr 21 01:55:33.861430 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:55:33.861386 2573 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb92e0bbc_a17e_4900_a957_ac2c523165b1.slice/crio-8d587891d09862b5d9dd9ac7f948ecaae1905d65f29762bb43381da880692e22.scope\": RecentStats: unable to find data in memory cache]"
Apr 21 01:55:33.933202 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:33.933169 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-p6v5j"]
Apr 21 01:55:33.942912 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:33.942886 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-587ccfb98-p6v5j"
Apr 21 01:55:33.945692 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:33.945662 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\""
Apr 21 01:55:33.945852 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:33.945773 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\""
Apr 21 01:55:33.946543 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:33.946522 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-p6v5j"]
Apr 21 01:55:34.023731 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:34.023690 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7rjj\" (UniqueName: \"kubernetes.io/projected/51316da7-6467-4b14-9185-a76f51d0a793-kube-api-access-v7rjj\") pod \"cert-manager-webhook-587ccfb98-p6v5j\" (UID: \"51316da7-6467-4b14-9185-a76f51d0a793\") " pod="cert-manager/cert-manager-webhook-587ccfb98-p6v5j"
Apr 21 01:55:34.023731 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:34.023728 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/51316da7-6467-4b14-9185-a76f51d0a793-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-p6v5j\" (UID: \"51316da7-6467-4b14-9185-a76f51d0a793\") " pod="cert-manager/cert-manager-webhook-587ccfb98-p6v5j"
Apr 21 01:55:34.124767 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:34.124732 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v7rjj\" (UniqueName: \"kubernetes.io/projected/51316da7-6467-4b14-9185-a76f51d0a793-kube-api-access-v7rjj\") pod \"cert-manager-webhook-587ccfb98-p6v5j\" (UID: \"51316da7-6467-4b14-9185-a76f51d0a793\") " pod="cert-manager/cert-manager-webhook-587ccfb98-p6v5j"
Apr 21 01:55:34.124961 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:34.124776 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/51316da7-6467-4b14-9185-a76f51d0a793-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-p6v5j\" (UID: \"51316da7-6467-4b14-9185-a76f51d0a793\") " pod="cert-manager/cert-manager-webhook-587ccfb98-p6v5j"
Apr 21 01:55:34.132237 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:34.132209 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/51316da7-6467-4b14-9185-a76f51d0a793-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-p6v5j\" (UID: \"51316da7-6467-4b14-9185-a76f51d0a793\") " pod="cert-manager/cert-manager-webhook-587ccfb98-p6v5j"
Apr 21 01:55:34.132394 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:34.132345 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7rjj\" (UniqueName: \"kubernetes.io/projected/51316da7-6467-4b14-9185-a76f51d0a793-kube-api-access-v7rjj\") pod \"cert-manager-webhook-587ccfb98-p6v5j\" (UID: \"51316da7-6467-4b14-9185-a76f51d0a793\") " pod="cert-manager/cert-manager-webhook-587ccfb98-p6v5j"
Apr 21 01:55:34.226889 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:34.226847 2573 generic.go:358] "Generic (PLEG): container finished" podID="b92e0bbc-a17e-4900-a957-ac2c523165b1" containerID="8d587891d09862b5d9dd9ac7f948ecaae1905d65f29762bb43381da880692e22" exitCode=0
Apr 21 01:55:34.227098 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:34.226891 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4frdw" event={"ID":"b92e0bbc-a17e-4900-a957-ac2c523165b1","Type":"ContainerDied","Data":"8d587891d09862b5d9dd9ac7f948ecaae1905d65f29762bb43381da880692e22"}
Apr 21 01:55:34.227098 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:34.226932 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4frdw" event={"ID":"b92e0bbc-a17e-4900-a957-ac2c523165b1","Type":"ContainerStarted","Data":"95c0f3acc7204f8919bc99f448d89da0d3d811c63381e978123af8038acd9269"}
Apr 21 01:55:34.268076 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:34.268042 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-587ccfb98-p6v5j"
Apr 21 01:55:34.390042 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:34.389966 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-p6v5j"]
Apr 21 01:55:34.392750 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:55:34.392717 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod51316da7_6467_4b14_9185_a76f51d0a793.slice/crio-d8c6939b4e414d55f54525c35062182815cd24520f0eb43ccb3bb9a4e13a18ee WatchSource:0}: Error finding container d8c6939b4e414d55f54525c35062182815cd24520f0eb43ccb3bb9a4e13a18ee: Status 404 returned error can't find the container with id d8c6939b4e414d55f54525c35062182815cd24520f0eb43ccb3bb9a4e13a18ee
Apr 21 01:55:35.232183 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:35.232137 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-587ccfb98-p6v5j" event={"ID":"51316da7-6467-4b14-9185-a76f51d0a793","Type":"ContainerStarted","Data":"d8c6939b4e414d55f54525c35062182815cd24520f0eb43ccb3bb9a4e13a18ee"}
Apr 21 01:55:38.243699 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:38.243658 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-587ccfb98-p6v5j" event={"ID":"51316da7-6467-4b14-9185-a76f51d0a793","Type":"ContainerStarted","Data":"bf5073bfe8e14dc83f750e76e72483f1abcfd4ce809c35575debe09b0febe22e"}
Apr 21 01:55:38.244187 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:38.243845 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="cert-manager/cert-manager-webhook-587ccfb98-p6v5j"
Apr 21 01:55:38.245280 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:38.245258 2573 generic.go:358] "Generic (PLEG): container finished" podID="b92e0bbc-a17e-4900-a957-ac2c523165b1" containerID="736fb170a129600c7f214241fd877d6275c4441f268105648e49986ce7e5487e" exitCode=0
Apr 21 01:55:38.245377 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:38.245345 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4frdw" event={"ID":"b92e0bbc-a17e-4900-a957-ac2c523165b1","Type":"ContainerDied","Data":"736fb170a129600c7f214241fd877d6275c4441f268105648e49986ce7e5487e"}
Apr 21 01:55:38.260226 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:38.260179 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-587ccfb98-p6v5j" podStartSLOduration=2.11712776 podStartE2EDuration="5.26016372s" podCreationTimestamp="2026-04-21 01:55:33 +0000 UTC" firstStartedPulling="2026-04-21 01:55:34.394450012 +0000 UTC m=+331.973752048" lastFinishedPulling="2026-04-21 01:55:37.537485961 +0000 UTC m=+335.116788008" observedRunningTime="2026-04-21 01:55:38.258364975 +0000 UTC m=+335.837667034" watchObservedRunningTime="2026-04-21 01:55:38.26016372 +0000 UTC m=+335.839465777"
Apr 21 01:55:39.250315 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:39.250284 2573 generic.go:358] "Generic (PLEG): container finished" podID="b92e0bbc-a17e-4900-a957-ac2c523165b1" containerID="18f7bcda7d0e3f83dbaa59f014567ffa2efe45962f402bd701d19312ae88e24e" exitCode=0
Apr 21 01:55:39.250716 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:39.250372 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4frdw" event={"ID":"b92e0bbc-a17e-4900-a957-ac2c523165b1","Type":"ContainerDied","Data":"18f7bcda7d0e3f83dbaa59f014567ffa2efe45962f402bd701d19312ae88e24e"}
Apr 21 01:55:40.383160 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:40.383137 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4frdw"
Apr 21 01:55:40.484542 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:40.484505 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b92e0bbc-a17e-4900-a957-ac2c523165b1-bundle\") pod \"b92e0bbc-a17e-4900-a957-ac2c523165b1\" (UID: \"b92e0bbc-a17e-4900-a957-ac2c523165b1\") "
Apr 21 01:55:40.484717 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:40.484567 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b92e0bbc-a17e-4900-a957-ac2c523165b1-util\") pod \"b92e0bbc-a17e-4900-a957-ac2c523165b1\" (UID: \"b92e0bbc-a17e-4900-a957-ac2c523165b1\") "
Apr 21 01:55:40.484717 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:40.484692 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4gggm\" (UniqueName: \"kubernetes.io/projected/b92e0bbc-a17e-4900-a957-ac2c523165b1-kube-api-access-4gggm\") pod \"b92e0bbc-a17e-4900-a957-ac2c523165b1\" (UID: \"b92e0bbc-a17e-4900-a957-ac2c523165b1\") "
Apr 21 01:55:40.485019 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:40.484996 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b92e0bbc-a17e-4900-a957-ac2c523165b1-bundle" (OuterVolumeSpecName: "bundle") pod "b92e0bbc-a17e-4900-a957-ac2c523165b1" (UID: "b92e0bbc-a17e-4900-a957-ac2c523165b1"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 01:55:40.486808 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:40.486776 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b92e0bbc-a17e-4900-a957-ac2c523165b1-kube-api-access-4gggm" (OuterVolumeSpecName: "kube-api-access-4gggm") pod "b92e0bbc-a17e-4900-a957-ac2c523165b1" (UID: "b92e0bbc-a17e-4900-a957-ac2c523165b1"). InnerVolumeSpecName "kube-api-access-4gggm". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 01:55:40.489242 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:40.489206 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b92e0bbc-a17e-4900-a957-ac2c523165b1-util" (OuterVolumeSpecName: "util") pod "b92e0bbc-a17e-4900-a957-ac2c523165b1" (UID: "b92e0bbc-a17e-4900-a957-ac2c523165b1"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 01:55:40.585696 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:40.585608 2573 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b92e0bbc-a17e-4900-a957-ac2c523165b1-util\") on node \"ip-10-0-129-52.ec2.internal\" DevicePath \"\""
Apr 21 01:55:40.585696 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:40.585640 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4gggm\" (UniqueName: \"kubernetes.io/projected/b92e0bbc-a17e-4900-a957-ac2c523165b1-kube-api-access-4gggm\") on node \"ip-10-0-129-52.ec2.internal\" DevicePath \"\""
Apr 21 01:55:40.585696 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:40.585650 2573 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b92e0bbc-a17e-4900-a957-ac2c523165b1-bundle\") on node \"ip-10-0-129-52.ec2.internal\" DevicePath \"\""
Apr 21 01:55:41.258455 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:41.258418 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4frdw" event={"ID":"b92e0bbc-a17e-4900-a957-ac2c523165b1","Type":"ContainerDied","Data":"95c0f3acc7204f8919bc99f448d89da0d3d811c63381e978123af8038acd9269"}
Apr 21 01:55:41.258455 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:41.258455 2573 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="95c0f3acc7204f8919bc99f448d89da0d3d811c63381e978123af8038acd9269"
Apr 21 01:55:41.258658 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:41.258475 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f4frdw"
Apr 21 01:55:44.252495 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:44.252466 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-587ccfb98-p6v5j"
Apr 21 01:55:44.701150 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:44.701112 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-79c8d999ff-5kdfb"]
Apr 21 01:55:44.701444 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:44.701432 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b92e0bbc-a17e-4900-a957-ac2c523165b1" containerName="util"
Apr 21 01:55:44.701499 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:44.701446 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="b92e0bbc-a17e-4900-a957-ac2c523165b1" containerName="util"
Apr 21 01:55:44.701499 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:44.701466 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b92e0bbc-a17e-4900-a957-ac2c523165b1" containerName="extract"
Apr 21 01:55:44.701499 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:44.701472 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="b92e0bbc-a17e-4900-a957-ac2c523165b1" containerName="extract"
Apr 21 01:55:44.701499 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:44.701479 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b92e0bbc-a17e-4900-a957-ac2c523165b1" containerName="pull"
Apr 21 01:55:44.701499 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:44.701483 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="b92e0bbc-a17e-4900-a957-ac2c523165b1" containerName="pull"
Apr 21 01:55:44.701646 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:44.701532 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="b92e0bbc-a17e-4900-a957-ac2c523165b1" containerName="extract"
Apr 21 01:55:44.705675 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:44.705652 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-79c8d999ff-5kdfb"
Apr 21 01:55:44.707910 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:44.707886 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-dockercfg-dvwzv\""
Apr 21 01:55:44.711831 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:44.711798 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-79c8d999ff-5kdfb"]
Apr 21 01:55:44.820085 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:44.820048 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9png\" (UniqueName: \"kubernetes.io/projected/469802e8-8f6e-4c5b-803b-f0406636bfb9-kube-api-access-w9png\") pod \"cert-manager-79c8d999ff-5kdfb\" (UID: \"469802e8-8f6e-4c5b-803b-f0406636bfb9\") " pod="cert-manager/cert-manager-79c8d999ff-5kdfb"
Apr 21 01:55:44.820263 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:44.820144 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/469802e8-8f6e-4c5b-803b-f0406636bfb9-bound-sa-token\") pod \"cert-manager-79c8d999ff-5kdfb\" (UID: \"469802e8-8f6e-4c5b-803b-f0406636bfb9\") " pod="cert-manager/cert-manager-79c8d999ff-5kdfb"
Apr 21 01:55:44.920702 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:44.920659 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/469802e8-8f6e-4c5b-803b-f0406636bfb9-bound-sa-token\") pod \"cert-manager-79c8d999ff-5kdfb\" (UID: \"469802e8-8f6e-4c5b-803b-f0406636bfb9\") " pod="cert-manager/cert-manager-79c8d999ff-5kdfb"
Apr 21 01:55:44.920924 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:44.920770 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w9png\" (UniqueName: \"kubernetes.io/projected/469802e8-8f6e-4c5b-803b-f0406636bfb9-kube-api-access-w9png\") pod \"cert-manager-79c8d999ff-5kdfb\" (UID: \"469802e8-8f6e-4c5b-803b-f0406636bfb9\") " pod="cert-manager/cert-manager-79c8d999ff-5kdfb"
Apr 21 01:55:44.928601 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:44.928563 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/469802e8-8f6e-4c5b-803b-f0406636bfb9-bound-sa-token\") pod \"cert-manager-79c8d999ff-5kdfb\" (UID: \"469802e8-8f6e-4c5b-803b-f0406636bfb9\") " pod="cert-manager/cert-manager-79c8d999ff-5kdfb"
Apr 21 01:55:44.928802 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:44.928785 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9png\" (UniqueName: \"kubernetes.io/projected/469802e8-8f6e-4c5b-803b-f0406636bfb9-kube-api-access-w9png\") pod \"cert-manager-79c8d999ff-5kdfb\" (UID: \"469802e8-8f6e-4c5b-803b-f0406636bfb9\") " pod="cert-manager/cert-manager-79c8d999ff-5kdfb"
Apr 21 01:55:45.015649 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:45.015549 2573 util.go:30] "No sandbox for pod can
be found. Need to start a new one" pod="cert-manager/cert-manager-79c8d999ff-5kdfb" Apr 21 01:55:45.139119 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:45.139095 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-79c8d999ff-5kdfb"] Apr 21 01:55:45.141895 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:55:45.141866 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod469802e8_8f6e_4c5b_803b_f0406636bfb9.slice/crio-650ea4c76709b34dde1895ca49b12d0fc2e56966aab0230a794add4ba6eed9b8 WatchSource:0}: Error finding container 650ea4c76709b34dde1895ca49b12d0fc2e56966aab0230a794add4ba6eed9b8: Status 404 returned error can't find the container with id 650ea4c76709b34dde1895ca49b12d0fc2e56966aab0230a794add4ba6eed9b8 Apr 21 01:55:45.272932 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:45.272841 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-79c8d999ff-5kdfb" event={"ID":"469802e8-8f6e-4c5b-803b-f0406636bfb9","Type":"ContainerStarted","Data":"0be6d5b36dfbf879822f2509bb74a0e6f99cc1638071f1d55e53c48f03479c97"} Apr 21 01:55:45.272932 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:45.272882 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-79c8d999ff-5kdfb" event={"ID":"469802e8-8f6e-4c5b-803b-f0406636bfb9","Type":"ContainerStarted","Data":"650ea4c76709b34dde1895ca49b12d0fc2e56966aab0230a794add4ba6eed9b8"} Apr 21 01:55:45.287213 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:45.287163 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-79c8d999ff-5kdfb" podStartSLOduration=1.287148848 podStartE2EDuration="1.287148848s" podCreationTimestamp="2026-04-21 01:55:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 01:55:45.285974189 +0000 UTC 
m=+342.865276248" watchObservedRunningTime="2026-04-21 01:55:45.287148848 +0000 UTC m=+342.866450905" Apr 21 01:55:55.151026 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:55.150940 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c575qww"] Apr 21 01:55:55.154531 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:55.154513 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c575qww" Apr 21 01:55:55.157072 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:55.157049 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-gj698\"" Apr 21 01:55:55.157202 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:55.157048 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 21 01:55:55.157893 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:55.157880 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 21 01:55:55.161226 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:55.161199 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c575qww"] Apr 21 01:55:55.318791 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:55.318747 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6ee8aeb1-8b1f-4b9a-84c7-0fe1250ea558-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c575qww\" (UID: \"6ee8aeb1-8b1f-4b9a-84c7-0fe1250ea558\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c575qww" Apr 21 01:55:55.319012 ip-10-0-129-52 kubenswrapper[2573]: 
I0421 01:55:55.318841 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jtvb\" (UniqueName: \"kubernetes.io/projected/6ee8aeb1-8b1f-4b9a-84c7-0fe1250ea558-kube-api-access-6jtvb\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c575qww\" (UID: \"6ee8aeb1-8b1f-4b9a-84c7-0fe1250ea558\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c575qww" Apr 21 01:55:55.319012 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:55.318918 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6ee8aeb1-8b1f-4b9a-84c7-0fe1250ea558-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c575qww\" (UID: \"6ee8aeb1-8b1f-4b9a-84c7-0fe1250ea558\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c575qww" Apr 21 01:55:55.419672 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:55.419586 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6ee8aeb1-8b1f-4b9a-84c7-0fe1250ea558-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c575qww\" (UID: \"6ee8aeb1-8b1f-4b9a-84c7-0fe1250ea558\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c575qww" Apr 21 01:55:55.419672 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:55.419635 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6jtvb\" (UniqueName: \"kubernetes.io/projected/6ee8aeb1-8b1f-4b9a-84c7-0fe1250ea558-kube-api-access-6jtvb\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c575qww\" (UID: \"6ee8aeb1-8b1f-4b9a-84c7-0fe1250ea558\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c575qww" Apr 21 01:55:55.419941 ip-10-0-129-52 kubenswrapper[2573]: I0421 
01:55:55.419694 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6ee8aeb1-8b1f-4b9a-84c7-0fe1250ea558-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c575qww\" (UID: \"6ee8aeb1-8b1f-4b9a-84c7-0fe1250ea558\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c575qww" Apr 21 01:55:55.420005 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:55.419968 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6ee8aeb1-8b1f-4b9a-84c7-0fe1250ea558-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c575qww\" (UID: \"6ee8aeb1-8b1f-4b9a-84c7-0fe1250ea558\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c575qww" Apr 21 01:55:55.420005 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:55.419988 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6ee8aeb1-8b1f-4b9a-84c7-0fe1250ea558-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c575qww\" (UID: \"6ee8aeb1-8b1f-4b9a-84c7-0fe1250ea558\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c575qww" Apr 21 01:55:55.427160 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:55.427129 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jtvb\" (UniqueName: \"kubernetes.io/projected/6ee8aeb1-8b1f-4b9a-84c7-0fe1250ea558-kube-api-access-6jtvb\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c575qww\" (UID: \"6ee8aeb1-8b1f-4b9a-84c7-0fe1250ea558\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c575qww" Apr 21 01:55:55.466021 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:55.465985 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c575qww" Apr 21 01:55:55.591158 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:55.591129 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c575qww"] Apr 21 01:55:55.593544 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:55:55.593518 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6ee8aeb1_8b1f_4b9a_84c7_0fe1250ea558.slice/crio-da1c11b1d00e90fb6c3e14316a2613b676f2d5dc18f576cac48a1026f72f508e WatchSource:0}: Error finding container da1c11b1d00e90fb6c3e14316a2613b676f2d5dc18f576cac48a1026f72f508e: Status 404 returned error can't find the container with id da1c11b1d00e90fb6c3e14316a2613b676f2d5dc18f576cac48a1026f72f508e Apr 21 01:55:56.309541 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:56.309505 2573 generic.go:358] "Generic (PLEG): container finished" podID="6ee8aeb1-8b1f-4b9a-84c7-0fe1250ea558" containerID="eee466f51f60f4ac2e9b0be0c126f115ea8920686332e5e3fb713a3fc943a16b" exitCode=0 Apr 21 01:55:56.309967 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:56.309569 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c575qww" event={"ID":"6ee8aeb1-8b1f-4b9a-84c7-0fe1250ea558","Type":"ContainerDied","Data":"eee466f51f60f4ac2e9b0be0c126f115ea8920686332e5e3fb713a3fc943a16b"} Apr 21 01:55:56.309967 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:56.309595 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c575qww" event={"ID":"6ee8aeb1-8b1f-4b9a-84c7-0fe1250ea558","Type":"ContainerStarted","Data":"da1c11b1d00e90fb6c3e14316a2613b676f2d5dc18f576cac48a1026f72f508e"} Apr 21 01:55:57.314201 ip-10-0-129-52 kubenswrapper[2573]: I0421 
01:55:57.314118 2573 generic.go:358] "Generic (PLEG): container finished" podID="6ee8aeb1-8b1f-4b9a-84c7-0fe1250ea558" containerID="7b5edb6075f50d60beb6e56cb01967e717941852d81a7f16dea282adcc80c236" exitCode=0 Apr 21 01:55:57.314201 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:57.314159 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c575qww" event={"ID":"6ee8aeb1-8b1f-4b9a-84c7-0fe1250ea558","Type":"ContainerDied","Data":"7b5edb6075f50d60beb6e56cb01967e717941852d81a7f16dea282adcc80c236"} Apr 21 01:55:58.319921 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:58.319890 2573 generic.go:358] "Generic (PLEG): container finished" podID="6ee8aeb1-8b1f-4b9a-84c7-0fe1250ea558" containerID="fc0d384d25625d0efe93b10e67e4172eaec7c3b37d5e8f140e395cd0c8af6b2e" exitCode=0 Apr 21 01:55:58.320315 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:58.319963 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c575qww" event={"ID":"6ee8aeb1-8b1f-4b9a-84c7-0fe1250ea558","Type":"ContainerDied","Data":"fc0d384d25625d0efe93b10e67e4172eaec7c3b37d5e8f140e395cd0c8af6b2e"} Apr 21 01:55:59.448318 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:59.448292 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c575qww" Apr 21 01:55:59.458034 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:59.458014 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6jtvb\" (UniqueName: \"kubernetes.io/projected/6ee8aeb1-8b1f-4b9a-84c7-0fe1250ea558-kube-api-access-6jtvb\") pod \"6ee8aeb1-8b1f-4b9a-84c7-0fe1250ea558\" (UID: \"6ee8aeb1-8b1f-4b9a-84c7-0fe1250ea558\") " Apr 21 01:55:59.458128 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:59.458065 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6ee8aeb1-8b1f-4b9a-84c7-0fe1250ea558-bundle\") pod \"6ee8aeb1-8b1f-4b9a-84c7-0fe1250ea558\" (UID: \"6ee8aeb1-8b1f-4b9a-84c7-0fe1250ea558\") " Apr 21 01:55:59.458227 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:59.458201 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6ee8aeb1-8b1f-4b9a-84c7-0fe1250ea558-util\") pod \"6ee8aeb1-8b1f-4b9a-84c7-0fe1250ea558\" (UID: \"6ee8aeb1-8b1f-4b9a-84c7-0fe1250ea558\") " Apr 21 01:55:59.458796 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:59.458774 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ee8aeb1-8b1f-4b9a-84c7-0fe1250ea558-bundle" (OuterVolumeSpecName: "bundle") pod "6ee8aeb1-8b1f-4b9a-84c7-0fe1250ea558" (UID: "6ee8aeb1-8b1f-4b9a-84c7-0fe1250ea558"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 01:55:59.460146 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:59.460124 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ee8aeb1-8b1f-4b9a-84c7-0fe1250ea558-kube-api-access-6jtvb" (OuterVolumeSpecName: "kube-api-access-6jtvb") pod "6ee8aeb1-8b1f-4b9a-84c7-0fe1250ea558" (UID: "6ee8aeb1-8b1f-4b9a-84c7-0fe1250ea558"). InnerVolumeSpecName "kube-api-access-6jtvb". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 01:55:59.463898 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:59.463860 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ee8aeb1-8b1f-4b9a-84c7-0fe1250ea558-util" (OuterVolumeSpecName: "util") pod "6ee8aeb1-8b1f-4b9a-84c7-0fe1250ea558" (UID: "6ee8aeb1-8b1f-4b9a-84c7-0fe1250ea558"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 01:55:59.559217 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:59.559182 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6jtvb\" (UniqueName: \"kubernetes.io/projected/6ee8aeb1-8b1f-4b9a-84c7-0fe1250ea558-kube-api-access-6jtvb\") on node \"ip-10-0-129-52.ec2.internal\" DevicePath \"\"" Apr 21 01:55:59.559217 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:59.559210 2573 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6ee8aeb1-8b1f-4b9a-84c7-0fe1250ea558-bundle\") on node \"ip-10-0-129-52.ec2.internal\" DevicePath \"\"" Apr 21 01:55:59.559217 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:55:59.559219 2573 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6ee8aeb1-8b1f-4b9a-84c7-0fe1250ea558-util\") on node \"ip-10-0-129-52.ec2.internal\" DevicePath \"\"" Apr 21 01:56:00.328657 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:00.328626 2573 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c575qww" event={"ID":"6ee8aeb1-8b1f-4b9a-84c7-0fe1250ea558","Type":"ContainerDied","Data":"da1c11b1d00e90fb6c3e14316a2613b676f2d5dc18f576cac48a1026f72f508e"} Apr 21 01:56:00.328657 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:00.328651 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c575qww" Apr 21 01:56:00.328657 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:00.328658 2573 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="da1c11b1d00e90fb6c3e14316a2613b676f2d5dc18f576cac48a1026f72f508e" Apr 21 01:56:08.022797 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:08.022759 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-5bf8b8945f-qmpmb"] Apr 21 01:56:08.023347 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:08.023132 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6ee8aeb1-8b1f-4b9a-84c7-0fe1250ea558" containerName="pull" Apr 21 01:56:08.023347 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:08.023144 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ee8aeb1-8b1f-4b9a-84c7-0fe1250ea558" containerName="pull" Apr 21 01:56:08.023347 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:08.023156 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6ee8aeb1-8b1f-4b9a-84c7-0fe1250ea558" containerName="extract" Apr 21 01:56:08.023347 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:08.023161 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ee8aeb1-8b1f-4b9a-84c7-0fe1250ea558" containerName="extract" Apr 21 01:56:08.023347 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:08.023176 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="6ee8aeb1-8b1f-4b9a-84c7-0fe1250ea558" containerName="util" Apr 21 01:56:08.023347 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:08.023182 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ee8aeb1-8b1f-4b9a-84c7-0fe1250ea558" containerName="util" Apr 21 01:56:08.023347 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:08.023236 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="6ee8aeb1-8b1f-4b9a-84c7-0fe1250ea558" containerName="extract" Apr 21 01:56:08.027543 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:08.027524 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-5bf8b8945f-qmpmb" Apr 21 01:56:08.030200 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:08.030180 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 21 01:56:08.031174 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:08.031148 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 21 01:56:08.031174 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:08.031145 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\"" Apr 21 01:56:08.031339 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:08.031166 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-6s4r5\"" Apr 21 01:56:08.031339 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:08.031195 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\"" Apr 21 01:56:08.031339 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:08.031260 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\"" 
Apr 21 01:56:08.035682 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:08.035664 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-5bf8b8945f-qmpmb"] Apr 21 01:56:08.135325 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:08.135280 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6m8pl\" (UniqueName: \"kubernetes.io/projected/97eee220-e43b-4185-a8b5-93170f8ceacd-kube-api-access-6m8pl\") pod \"lws-controller-manager-5bf8b8945f-qmpmb\" (UID: \"97eee220-e43b-4185-a8b5-93170f8ceacd\") " pod="openshift-lws-operator/lws-controller-manager-5bf8b8945f-qmpmb" Apr 21 01:56:08.135504 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:08.135356 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/97eee220-e43b-4185-a8b5-93170f8ceacd-manager-config\") pod \"lws-controller-manager-5bf8b8945f-qmpmb\" (UID: \"97eee220-e43b-4185-a8b5-93170f8ceacd\") " pod="openshift-lws-operator/lws-controller-manager-5bf8b8945f-qmpmb" Apr 21 01:56:08.135595 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:08.135567 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/97eee220-e43b-4185-a8b5-93170f8ceacd-metrics-cert\") pod \"lws-controller-manager-5bf8b8945f-qmpmb\" (UID: \"97eee220-e43b-4185-a8b5-93170f8ceacd\") " pod="openshift-lws-operator/lws-controller-manager-5bf8b8945f-qmpmb" Apr 21 01:56:08.135650 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:08.135621 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/97eee220-e43b-4185-a8b5-93170f8ceacd-cert\") pod \"lws-controller-manager-5bf8b8945f-qmpmb\" (UID: \"97eee220-e43b-4185-a8b5-93170f8ceacd\") " 
pod="openshift-lws-operator/lws-controller-manager-5bf8b8945f-qmpmb" Apr 21 01:56:08.236298 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:08.236265 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/97eee220-e43b-4185-a8b5-93170f8ceacd-cert\") pod \"lws-controller-manager-5bf8b8945f-qmpmb\" (UID: \"97eee220-e43b-4185-a8b5-93170f8ceacd\") " pod="openshift-lws-operator/lws-controller-manager-5bf8b8945f-qmpmb" Apr 21 01:56:08.236413 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:08.236338 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6m8pl\" (UniqueName: \"kubernetes.io/projected/97eee220-e43b-4185-a8b5-93170f8ceacd-kube-api-access-6m8pl\") pod \"lws-controller-manager-5bf8b8945f-qmpmb\" (UID: \"97eee220-e43b-4185-a8b5-93170f8ceacd\") " pod="openshift-lws-operator/lws-controller-manager-5bf8b8945f-qmpmb" Apr 21 01:56:08.236413 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:08.236374 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/97eee220-e43b-4185-a8b5-93170f8ceacd-manager-config\") pod \"lws-controller-manager-5bf8b8945f-qmpmb\" (UID: \"97eee220-e43b-4185-a8b5-93170f8ceacd\") " pod="openshift-lws-operator/lws-controller-manager-5bf8b8945f-qmpmb" Apr 21 01:56:08.236498 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:08.236441 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/97eee220-e43b-4185-a8b5-93170f8ceacd-metrics-cert\") pod \"lws-controller-manager-5bf8b8945f-qmpmb\" (UID: \"97eee220-e43b-4185-a8b5-93170f8ceacd\") " pod="openshift-lws-operator/lws-controller-manager-5bf8b8945f-qmpmb" Apr 21 01:56:08.237118 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:08.237092 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" 
(UniqueName: \"kubernetes.io/configmap/97eee220-e43b-4185-a8b5-93170f8ceacd-manager-config\") pod \"lws-controller-manager-5bf8b8945f-qmpmb\" (UID: \"97eee220-e43b-4185-a8b5-93170f8ceacd\") " pod="openshift-lws-operator/lws-controller-manager-5bf8b8945f-qmpmb" Apr 21 01:56:08.238914 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:08.238892 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/97eee220-e43b-4185-a8b5-93170f8ceacd-cert\") pod \"lws-controller-manager-5bf8b8945f-qmpmb\" (UID: \"97eee220-e43b-4185-a8b5-93170f8ceacd\") " pod="openshift-lws-operator/lws-controller-manager-5bf8b8945f-qmpmb" Apr 21 01:56:08.239017 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:08.238933 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/97eee220-e43b-4185-a8b5-93170f8ceacd-metrics-cert\") pod \"lws-controller-manager-5bf8b8945f-qmpmb\" (UID: \"97eee220-e43b-4185-a8b5-93170f8ceacd\") " pod="openshift-lws-operator/lws-controller-manager-5bf8b8945f-qmpmb" Apr 21 01:56:08.244343 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:08.244322 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6m8pl\" (UniqueName: \"kubernetes.io/projected/97eee220-e43b-4185-a8b5-93170f8ceacd-kube-api-access-6m8pl\") pod \"lws-controller-manager-5bf8b8945f-qmpmb\" (UID: \"97eee220-e43b-4185-a8b5-93170f8ceacd\") " pod="openshift-lws-operator/lws-controller-manager-5bf8b8945f-qmpmb" Apr 21 01:56:08.337733 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:08.337630 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-5bf8b8945f-qmpmb" Apr 21 01:56:08.462866 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:08.462838 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-5bf8b8945f-qmpmb"] Apr 21 01:56:08.465446 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:56:08.465416 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod97eee220_e43b_4185_a8b5_93170f8ceacd.slice/crio-74ca109dfb96ae7f85b95f33e49e5380fb1ca7096989ee6d0b5b780a20a764cd WatchSource:0}: Error finding container 74ca109dfb96ae7f85b95f33e49e5380fb1ca7096989ee6d0b5b780a20a764cd: Status 404 returned error can't find the container with id 74ca109dfb96ae7f85b95f33e49e5380fb1ca7096989ee6d0b5b780a20a764cd Apr 21 01:56:09.359648 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:09.359608 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-5bf8b8945f-qmpmb" event={"ID":"97eee220-e43b-4185-a8b5-93170f8ceacd","Type":"ContainerStarted","Data":"74ca109dfb96ae7f85b95f33e49e5380fb1ca7096989ee6d0b5b780a20a764cd"} Apr 21 01:56:10.365092 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:10.365053 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-5bf8b8945f-qmpmb" event={"ID":"97eee220-e43b-4185-a8b5-93170f8ceacd","Type":"ContainerStarted","Data":"77ad4c24341f4444d937813f98515663bd06f9f4365d8fdceace3dd072548c5a"} Apr 21 01:56:10.365493 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:10.365151 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-5bf8b8945f-qmpmb" Apr 21 01:56:10.383543 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:10.383495 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-lws-operator/lws-controller-manager-5bf8b8945f-qmpmb" podStartSLOduration=0.981719668 podStartE2EDuration="2.38347778s" podCreationTimestamp="2026-04-21 01:56:08 +0000 UTC" firstStartedPulling="2026-04-21 01:56:08.467254758 +0000 UTC m=+366.046556793" lastFinishedPulling="2026-04-21 01:56:09.869012869 +0000 UTC m=+367.448314905" observedRunningTime="2026-04-21 01:56:10.38228107 +0000 UTC m=+367.961583128" watchObservedRunningTime="2026-04-21 01:56:10.38347778 +0000 UTC m=+367.962779838" Apr 21 01:56:11.603675 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:11.603637 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9md7vm"] Apr 21 01:56:11.641378 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:11.641343 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9md7vm"] Apr 21 01:56:11.641539 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:11.641502 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9md7vm" Apr 21 01:56:11.644020 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:11.643979 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 21 01:56:11.644020 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:11.644002 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 21 01:56:11.644261 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:11.644082 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-gj698\"" Apr 21 01:56:11.666876 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:11.666834 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fb7ba706-b287-4649-adc1-d6b6ba653f30-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9md7vm\" (UID: \"fb7ba706-b287-4649-adc1-d6b6ba653f30\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9md7vm" Apr 21 01:56:11.667050 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:11.666949 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fb7ba706-b287-4649-adc1-d6b6ba653f30-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9md7vm\" (UID: \"fb7ba706-b287-4649-adc1-d6b6ba653f30\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9md7vm" Apr 21 01:56:11.667050 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:11.667024 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24nsm\" (UniqueName: 
\"kubernetes.io/projected/fb7ba706-b287-4649-adc1-d6b6ba653f30-kube-api-access-24nsm\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9md7vm\" (UID: \"fb7ba706-b287-4649-adc1-d6b6ba653f30\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9md7vm" Apr 21 01:56:11.768229 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:11.768188 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fb7ba706-b287-4649-adc1-d6b6ba653f30-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9md7vm\" (UID: \"fb7ba706-b287-4649-adc1-d6b6ba653f30\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9md7vm" Apr 21 01:56:11.768411 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:11.768264 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-24nsm\" (UniqueName: \"kubernetes.io/projected/fb7ba706-b287-4649-adc1-d6b6ba653f30-kube-api-access-24nsm\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9md7vm\" (UID: \"fb7ba706-b287-4649-adc1-d6b6ba653f30\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9md7vm" Apr 21 01:56:11.768411 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:11.768294 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fb7ba706-b287-4649-adc1-d6b6ba653f30-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9md7vm\" (UID: \"fb7ba706-b287-4649-adc1-d6b6ba653f30\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9md7vm" Apr 21 01:56:11.768601 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:11.768580 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fb7ba706-b287-4649-adc1-d6b6ba653f30-bundle\") 
pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9md7vm\" (UID: \"fb7ba706-b287-4649-adc1-d6b6ba653f30\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9md7vm" Apr 21 01:56:11.768652 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:11.768604 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fb7ba706-b287-4649-adc1-d6b6ba653f30-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9md7vm\" (UID: \"fb7ba706-b287-4649-adc1-d6b6ba653f30\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9md7vm" Apr 21 01:56:11.776109 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:11.776077 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-24nsm\" (UniqueName: \"kubernetes.io/projected/fb7ba706-b287-4649-adc1-d6b6ba653f30-kube-api-access-24nsm\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9md7vm\" (UID: \"fb7ba706-b287-4649-adc1-d6b6ba653f30\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9md7vm" Apr 21 01:56:11.950909 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:11.950863 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9md7vm" Apr 21 01:56:12.074190 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:12.074098 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9md7vm"] Apr 21 01:56:12.076433 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:56:12.076405 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb7ba706_b287_4649_adc1_d6b6ba653f30.slice/crio-3a985d510e9e03009cdfdac3e40dbadbcc673e3cc64dbcae0f47e0bb13735e5b WatchSource:0}: Error finding container 3a985d510e9e03009cdfdac3e40dbadbcc673e3cc64dbcae0f47e0bb13735e5b: Status 404 returned error can't find the container with id 3a985d510e9e03009cdfdac3e40dbadbcc673e3cc64dbcae0f47e0bb13735e5b Apr 21 01:56:12.297249 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:12.297215 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-64bbc69db5-jqbhx"] Apr 21 01:56:12.312171 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:12.312141 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-64bbc69db5-jqbhx" Apr 21 01:56:12.316773 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:12.316748 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-jjk2t\"" Apr 21 01:56:12.316941 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:12.316740 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\"" Apr 21 01:56:12.316941 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:12.316755 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\"" Apr 21 01:56:12.317177 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:12.317160 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\"" Apr 21 01:56:12.320013 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:12.319993 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\"" Apr 21 01:56:12.327077 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:12.327007 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-64bbc69db5-jqbhx"] Apr 21 01:56:12.374642 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:12.374611 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a421cc19-79ad-41ba-8dfc-971995cc31a0-apiservice-cert\") pod \"opendatahub-operator-controller-manager-64bbc69db5-jqbhx\" (UID: \"a421cc19-79ad-41ba-8dfc-971995cc31a0\") " pod="opendatahub/opendatahub-operator-controller-manager-64bbc69db5-jqbhx" Apr 21 01:56:12.374894 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:12.374659 2573 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a421cc19-79ad-41ba-8dfc-971995cc31a0-webhook-cert\") pod \"opendatahub-operator-controller-manager-64bbc69db5-jqbhx\" (UID: \"a421cc19-79ad-41ba-8dfc-971995cc31a0\") " pod="opendatahub/opendatahub-operator-controller-manager-64bbc69db5-jqbhx" Apr 21 01:56:12.374894 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:12.374763 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bp9wd\" (UniqueName: \"kubernetes.io/projected/a421cc19-79ad-41ba-8dfc-971995cc31a0-kube-api-access-bp9wd\") pod \"opendatahub-operator-controller-manager-64bbc69db5-jqbhx\" (UID: \"a421cc19-79ad-41ba-8dfc-971995cc31a0\") " pod="opendatahub/opendatahub-operator-controller-manager-64bbc69db5-jqbhx" Apr 21 01:56:12.375619 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:12.375597 2573 generic.go:358] "Generic (PLEG): container finished" podID="fb7ba706-b287-4649-adc1-d6b6ba653f30" containerID="5c1e31eddcfeb974d1d7a931b3bbffb0d26bcb83793dae3889014c6e0b9ab499" exitCode=0 Apr 21 01:56:12.375713 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:12.375689 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9md7vm" event={"ID":"fb7ba706-b287-4649-adc1-d6b6ba653f30","Type":"ContainerDied","Data":"5c1e31eddcfeb974d1d7a931b3bbffb0d26bcb83793dae3889014c6e0b9ab499"} Apr 21 01:56:12.375751 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:12.375730 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9md7vm" event={"ID":"fb7ba706-b287-4649-adc1-d6b6ba653f30","Type":"ContainerStarted","Data":"3a985d510e9e03009cdfdac3e40dbadbcc673e3cc64dbcae0f47e0bb13735e5b"} Apr 21 01:56:12.475695 ip-10-0-129-52 kubenswrapper[2573]: I0421 
01:56:12.475664 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a421cc19-79ad-41ba-8dfc-971995cc31a0-apiservice-cert\") pod \"opendatahub-operator-controller-manager-64bbc69db5-jqbhx\" (UID: \"a421cc19-79ad-41ba-8dfc-971995cc31a0\") " pod="opendatahub/opendatahub-operator-controller-manager-64bbc69db5-jqbhx" Apr 21 01:56:12.475695 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:12.475709 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a421cc19-79ad-41ba-8dfc-971995cc31a0-webhook-cert\") pod \"opendatahub-operator-controller-manager-64bbc69db5-jqbhx\" (UID: \"a421cc19-79ad-41ba-8dfc-971995cc31a0\") " pod="opendatahub/opendatahub-operator-controller-manager-64bbc69db5-jqbhx" Apr 21 01:56:12.476003 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:12.475749 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bp9wd\" (UniqueName: \"kubernetes.io/projected/a421cc19-79ad-41ba-8dfc-971995cc31a0-kube-api-access-bp9wd\") pod \"opendatahub-operator-controller-manager-64bbc69db5-jqbhx\" (UID: \"a421cc19-79ad-41ba-8dfc-971995cc31a0\") " pod="opendatahub/opendatahub-operator-controller-manager-64bbc69db5-jqbhx" Apr 21 01:56:12.478251 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:12.478232 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a421cc19-79ad-41ba-8dfc-971995cc31a0-webhook-cert\") pod \"opendatahub-operator-controller-manager-64bbc69db5-jqbhx\" (UID: \"a421cc19-79ad-41ba-8dfc-971995cc31a0\") " pod="opendatahub/opendatahub-operator-controller-manager-64bbc69db5-jqbhx" Apr 21 01:56:12.478345 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:12.478255 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/a421cc19-79ad-41ba-8dfc-971995cc31a0-apiservice-cert\") pod \"opendatahub-operator-controller-manager-64bbc69db5-jqbhx\" (UID: \"a421cc19-79ad-41ba-8dfc-971995cc31a0\") " pod="opendatahub/opendatahub-operator-controller-manager-64bbc69db5-jqbhx" Apr 21 01:56:12.487790 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:12.487740 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bp9wd\" (UniqueName: \"kubernetes.io/projected/a421cc19-79ad-41ba-8dfc-971995cc31a0-kube-api-access-bp9wd\") pod \"opendatahub-operator-controller-manager-64bbc69db5-jqbhx\" (UID: \"a421cc19-79ad-41ba-8dfc-971995cc31a0\") " pod="opendatahub/opendatahub-operator-controller-manager-64bbc69db5-jqbhx" Apr 21 01:56:12.630962 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:12.630929 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-64bbc69db5-jqbhx" Apr 21 01:56:12.779460 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:12.779430 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-64bbc69db5-jqbhx"] Apr 21 01:56:12.782143 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:56:12.782112 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda421cc19_79ad_41ba_8dfc_971995cc31a0.slice/crio-062d88216896ac1e08788d04cb1406874086cef3b4c70f66621e29bfb1d7f94f WatchSource:0}: Error finding container 062d88216896ac1e08788d04cb1406874086cef3b4c70f66621e29bfb1d7f94f: Status 404 returned error can't find the container with id 062d88216896ac1e08788d04cb1406874086cef3b4c70f66621e29bfb1d7f94f Apr 21 01:56:13.381981 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:13.381949 2573 generic.go:358] "Generic (PLEG): container finished" podID="fb7ba706-b287-4649-adc1-d6b6ba653f30" 
containerID="f0c391a9aca29a8f94b764e97d59ac7423a9297b170e56b3a2471f3787261824" exitCode=0 Apr 21 01:56:13.382193 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:13.382037 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9md7vm" event={"ID":"fb7ba706-b287-4649-adc1-d6b6ba653f30","Type":"ContainerDied","Data":"f0c391a9aca29a8f94b764e97d59ac7423a9297b170e56b3a2471f3787261824"} Apr 21 01:56:13.383477 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:13.383453 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-64bbc69db5-jqbhx" event={"ID":"a421cc19-79ad-41ba-8dfc-971995cc31a0","Type":"ContainerStarted","Data":"062d88216896ac1e08788d04cb1406874086cef3b4c70f66621e29bfb1d7f94f"} Apr 21 01:56:14.390627 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:14.390548 2573 generic.go:358] "Generic (PLEG): container finished" podID="fb7ba706-b287-4649-adc1-d6b6ba653f30" containerID="43113bd0a55477b61192b445207c86c7cfb221a41967b9ee620e2a75fb85d18e" exitCode=0 Apr 21 01:56:14.390627 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:14.390621 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9md7vm" event={"ID":"fb7ba706-b287-4649-adc1-d6b6ba653f30","Type":"ContainerDied","Data":"43113bd0a55477b61192b445207c86c7cfb221a41967b9ee620e2a75fb85d18e"} Apr 21 01:56:15.399035 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:15.398984 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-64bbc69db5-jqbhx" event={"ID":"a421cc19-79ad-41ba-8dfc-971995cc31a0","Type":"ContainerStarted","Data":"a7f86b9de7ea1c164db57d35f219691635e6620b7f647197535339ced6977cb0"} Apr 21 01:56:15.399439 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:15.399059 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not 
ready" pod="opendatahub/opendatahub-operator-controller-manager-64bbc69db5-jqbhx" Apr 21 01:56:15.423628 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:15.423565 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-64bbc69db5-jqbhx" podStartSLOduration=0.912938453 podStartE2EDuration="3.423548155s" podCreationTimestamp="2026-04-21 01:56:12 +0000 UTC" firstStartedPulling="2026-04-21 01:56:12.783996481 +0000 UTC m=+370.363298521" lastFinishedPulling="2026-04-21 01:56:15.294606177 +0000 UTC m=+372.873908223" observedRunningTime="2026-04-21 01:56:15.421605647 +0000 UTC m=+373.000907706" watchObservedRunningTime="2026-04-21 01:56:15.423548155 +0000 UTC m=+373.002850214" Apr 21 01:56:15.525108 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:15.525085 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9md7vm" Apr 21 01:56:15.608110 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:15.608019 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fb7ba706-b287-4649-adc1-d6b6ba653f30-bundle\") pod \"fb7ba706-b287-4649-adc1-d6b6ba653f30\" (UID: \"fb7ba706-b287-4649-adc1-d6b6ba653f30\") " Apr 21 01:56:15.608110 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:15.608072 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-24nsm\" (UniqueName: \"kubernetes.io/projected/fb7ba706-b287-4649-adc1-d6b6ba653f30-kube-api-access-24nsm\") pod \"fb7ba706-b287-4649-adc1-d6b6ba653f30\" (UID: \"fb7ba706-b287-4649-adc1-d6b6ba653f30\") " Apr 21 01:56:15.608300 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:15.608146 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/fb7ba706-b287-4649-adc1-d6b6ba653f30-util\") pod \"fb7ba706-b287-4649-adc1-d6b6ba653f30\" (UID: \"fb7ba706-b287-4649-adc1-d6b6ba653f30\") " Apr 21 01:56:15.608906 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:15.608866 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb7ba706-b287-4649-adc1-d6b6ba653f30-bundle" (OuterVolumeSpecName: "bundle") pod "fb7ba706-b287-4649-adc1-d6b6ba653f30" (UID: "fb7ba706-b287-4649-adc1-d6b6ba653f30"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 01:56:15.610284 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:15.610254 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb7ba706-b287-4649-adc1-d6b6ba653f30-kube-api-access-24nsm" (OuterVolumeSpecName: "kube-api-access-24nsm") pod "fb7ba706-b287-4649-adc1-d6b6ba653f30" (UID: "fb7ba706-b287-4649-adc1-d6b6ba653f30"). InnerVolumeSpecName "kube-api-access-24nsm". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 01:56:15.613001 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:15.612978 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb7ba706-b287-4649-adc1-d6b6ba653f30-util" (OuterVolumeSpecName: "util") pod "fb7ba706-b287-4649-adc1-d6b6ba653f30" (UID: "fb7ba706-b287-4649-adc1-d6b6ba653f30"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 01:56:15.709610 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:15.709567 2573 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fb7ba706-b287-4649-adc1-d6b6ba653f30-bundle\") on node \"ip-10-0-129-52.ec2.internal\" DevicePath \"\"" Apr 21 01:56:15.709610 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:15.709599 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-24nsm\" (UniqueName: \"kubernetes.io/projected/fb7ba706-b287-4649-adc1-d6b6ba653f30-kube-api-access-24nsm\") on node \"ip-10-0-129-52.ec2.internal\" DevicePath \"\"" Apr 21 01:56:15.709610 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:15.709609 2573 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fb7ba706-b287-4649-adc1-d6b6ba653f30-util\") on node \"ip-10-0-129-52.ec2.internal\" DevicePath \"\"" Apr 21 01:56:16.404096 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:16.404060 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9md7vm" event={"ID":"fb7ba706-b287-4649-adc1-d6b6ba653f30","Type":"ContainerDied","Data":"3a985d510e9e03009cdfdac3e40dbadbcc673e3cc64dbcae0f47e0bb13735e5b"} Apr 21 01:56:16.404096 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:16.404096 2573 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3a985d510e9e03009cdfdac3e40dbadbcc673e3cc64dbcae0f47e0bb13735e5b" Apr 21 01:56:16.404649 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:16.404327 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9md7vm" Apr 21 01:56:21.374365 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:21.374333 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-5bf8b8945f-qmpmb" Apr 21 01:56:26.406106 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:26.406075 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-64bbc69db5-jqbhx" Apr 21 01:56:29.978516 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:29.978479 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835h55d6"] Apr 21 01:56:29.979023 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:29.979006 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fb7ba706-b287-4649-adc1-d6b6ba653f30" containerName="extract" Apr 21 01:56:29.979085 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:29.979027 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb7ba706-b287-4649-adc1-d6b6ba653f30" containerName="extract" Apr 21 01:56:29.979085 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:29.979059 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fb7ba706-b287-4649-adc1-d6b6ba653f30" containerName="util" Apr 21 01:56:29.979085 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:29.979068 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb7ba706-b287-4649-adc1-d6b6ba653f30" containerName="util" Apr 21 01:56:29.979183 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:29.979090 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fb7ba706-b287-4649-adc1-d6b6ba653f30" containerName="pull" Apr 21 01:56:29.979183 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:29.979099 2573 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="fb7ba706-b287-4649-adc1-d6b6ba653f30" containerName="pull" Apr 21 01:56:29.979243 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:29.979189 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="fb7ba706-b287-4649-adc1-d6b6ba653f30" containerName="extract" Apr 21 01:56:29.983103 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:29.983078 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835h55d6" Apr 21 01:56:29.985645 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:29.985622 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 21 01:56:29.985787 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:29.985691 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-gj698\"" Apr 21 01:56:29.987044 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:29.987017 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 21 01:56:29.990465 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:29.990437 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835h55d6"] Apr 21 01:56:30.139745 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:30.139705 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5ba69c9c-9eb3-44b6-ac88-578f8c4b7329-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835h55d6\" (UID: \"5ba69c9c-9eb3-44b6-ac88-578f8c4b7329\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835h55d6" Apr 21 01:56:30.139967 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:30.139768 2573 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5ba69c9c-9eb3-44b6-ac88-578f8c4b7329-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835h55d6\" (UID: \"5ba69c9c-9eb3-44b6-ac88-578f8c4b7329\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835h55d6" Apr 21 01:56:30.139967 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:30.139861 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5n7d\" (UniqueName: \"kubernetes.io/projected/5ba69c9c-9eb3-44b6-ac88-578f8c4b7329-kube-api-access-g5n7d\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835h55d6\" (UID: \"5ba69c9c-9eb3-44b6-ac88-578f8c4b7329\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835h55d6" Apr 21 01:56:30.241331 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:30.241242 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5ba69c9c-9eb3-44b6-ac88-578f8c4b7329-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835h55d6\" (UID: \"5ba69c9c-9eb3-44b6-ac88-578f8c4b7329\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835h55d6" Apr 21 01:56:30.241331 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:30.241289 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g5n7d\" (UniqueName: \"kubernetes.io/projected/5ba69c9c-9eb3-44b6-ac88-578f8c4b7329-kube-api-access-g5n7d\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835h55d6\" (UID: \"5ba69c9c-9eb3-44b6-ac88-578f8c4b7329\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835h55d6" Apr 21 01:56:30.241560 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:30.241411 2573 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5ba69c9c-9eb3-44b6-ac88-578f8c4b7329-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835h55d6\" (UID: \"5ba69c9c-9eb3-44b6-ac88-578f8c4b7329\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835h55d6" Apr 21 01:56:30.241700 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:30.241681 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5ba69c9c-9eb3-44b6-ac88-578f8c4b7329-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835h55d6\" (UID: \"5ba69c9c-9eb3-44b6-ac88-578f8c4b7329\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835h55d6" Apr 21 01:56:30.241771 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:30.241751 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5ba69c9c-9eb3-44b6-ac88-578f8c4b7329-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835h55d6\" (UID: \"5ba69c9c-9eb3-44b6-ac88-578f8c4b7329\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835h55d6" Apr 21 01:56:30.252931 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:30.252903 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5n7d\" (UniqueName: \"kubernetes.io/projected/5ba69c9c-9eb3-44b6-ac88-578f8c4b7329-kube-api-access-g5n7d\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835h55d6\" (UID: \"5ba69c9c-9eb3-44b6-ac88-578f8c4b7329\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835h55d6" Apr 21 01:56:30.294907 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:30.294865 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835h55d6" Apr 21 01:56:30.424236 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:30.424211 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835h55d6"] Apr 21 01:56:30.427038 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:56:30.427006 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ba69c9c_9eb3_44b6_ac88_578f8c4b7329.slice/crio-0e487180832c36181fb26aaa53a256a1fef4ac404e277d1bc5d6395f116dc9d3 WatchSource:0}: Error finding container 0e487180832c36181fb26aaa53a256a1fef4ac404e277d1bc5d6395f116dc9d3: Status 404 returned error can't find the container with id 0e487180832c36181fb26aaa53a256a1fef4ac404e277d1bc5d6395f116dc9d3 Apr 21 01:56:30.455856 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:30.455809 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835h55d6" event={"ID":"5ba69c9c-9eb3-44b6-ac88-578f8c4b7329","Type":"ContainerStarted","Data":"0e487180832c36181fb26aaa53a256a1fef4ac404e277d1bc5d6395f116dc9d3"} Apr 21 01:56:30.597433 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:30.597398 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/kube-auth-proxy-56c874cd68-dzm28"] Apr 21 01:56:30.600947 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:30.600930 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/kube-auth-proxy-56c874cd68-dzm28" Apr 21 01:56:30.603465 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:30.603437 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-creds\"" Apr 21 01:56:30.603626 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:30.603437 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-tls\"" Apr 21 01:56:30.603626 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:30.603438 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-dockercfg-8qshb\"" Apr 21 01:56:30.609235 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:30.609204 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-56c874cd68-dzm28"] Apr 21 01:56:30.746216 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:30.746175 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/530dc938-f98c-4492-a65a-20e3a9d4750c-tmp\") pod \"kube-auth-proxy-56c874cd68-dzm28\" (UID: \"530dc938-f98c-4492-a65a-20e3a9d4750c\") " pod="openshift-ingress/kube-auth-proxy-56c874cd68-dzm28" Apr 21 01:56:30.746386 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:30.746310 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zq8mw\" (UniqueName: \"kubernetes.io/projected/530dc938-f98c-4492-a65a-20e3a9d4750c-kube-api-access-zq8mw\") pod \"kube-auth-proxy-56c874cd68-dzm28\" (UID: \"530dc938-f98c-4492-a65a-20e3a9d4750c\") " pod="openshift-ingress/kube-auth-proxy-56c874cd68-dzm28" Apr 21 01:56:30.746386 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:30.746357 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: 
\"kubernetes.io/secret/530dc938-f98c-4492-a65a-20e3a9d4750c-tls-certs\") pod \"kube-auth-proxy-56c874cd68-dzm28\" (UID: \"530dc938-f98c-4492-a65a-20e3a9d4750c\") " pod="openshift-ingress/kube-auth-proxy-56c874cd68-dzm28" Apr 21 01:56:30.847347 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:30.847232 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zq8mw\" (UniqueName: \"kubernetes.io/projected/530dc938-f98c-4492-a65a-20e3a9d4750c-kube-api-access-zq8mw\") pod \"kube-auth-proxy-56c874cd68-dzm28\" (UID: \"530dc938-f98c-4492-a65a-20e3a9d4750c\") " pod="openshift-ingress/kube-auth-proxy-56c874cd68-dzm28" Apr 21 01:56:30.847347 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:30.847313 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/530dc938-f98c-4492-a65a-20e3a9d4750c-tls-certs\") pod \"kube-auth-proxy-56c874cd68-dzm28\" (UID: \"530dc938-f98c-4492-a65a-20e3a9d4750c\") " pod="openshift-ingress/kube-auth-proxy-56c874cd68-dzm28" Apr 21 01:56:30.847597 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:30.847368 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/530dc938-f98c-4492-a65a-20e3a9d4750c-tmp\") pod \"kube-auth-proxy-56c874cd68-dzm28\" (UID: \"530dc938-f98c-4492-a65a-20e3a9d4750c\") " pod="openshift-ingress/kube-auth-proxy-56c874cd68-dzm28" Apr 21 01:56:30.849746 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:30.849715 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/530dc938-f98c-4492-a65a-20e3a9d4750c-tmp\") pod \"kube-auth-proxy-56c874cd68-dzm28\" (UID: \"530dc938-f98c-4492-a65a-20e3a9d4750c\") " pod="openshift-ingress/kube-auth-proxy-56c874cd68-dzm28" Apr 21 01:56:30.849886 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:30.849870 2573 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/530dc938-f98c-4492-a65a-20e3a9d4750c-tls-certs\") pod \"kube-auth-proxy-56c874cd68-dzm28\" (UID: \"530dc938-f98c-4492-a65a-20e3a9d4750c\") " pod="openshift-ingress/kube-auth-proxy-56c874cd68-dzm28" Apr 21 01:56:30.855568 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:30.855546 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zq8mw\" (UniqueName: \"kubernetes.io/projected/530dc938-f98c-4492-a65a-20e3a9d4750c-kube-api-access-zq8mw\") pod \"kube-auth-proxy-56c874cd68-dzm28\" (UID: \"530dc938-f98c-4492-a65a-20e3a9d4750c\") " pod="openshift-ingress/kube-auth-proxy-56c874cd68-dzm28" Apr 21 01:56:30.923376 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:30.923335 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/kube-auth-proxy-56c874cd68-dzm28" Apr 21 01:56:31.052750 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:31.052725 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-56c874cd68-dzm28"] Apr 21 01:56:31.054598 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:56:31.054569 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod530dc938_f98c_4492_a65a_20e3a9d4750c.slice/crio-1b364c680dc0c997f896692efe054d2a0d579319aee7b1c78bf7e73d93058d49 WatchSource:0}: Error finding container 1b364c680dc0c997f896692efe054d2a0d579319aee7b1c78bf7e73d93058d49: Status 404 returned error can't find the container with id 1b364c680dc0c997f896692efe054d2a0d579319aee7b1c78bf7e73d93058d49 Apr 21 01:56:31.460578 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:31.460541 2573 generic.go:358] "Generic (PLEG): container finished" podID="5ba69c9c-9eb3-44b6-ac88-578f8c4b7329" containerID="f4d9536e9d0e04e61676be636aa354def484713d3348c21ff4912cac49acc16c" exitCode=0 Apr 21 01:56:31.460735 ip-10-0-129-52 kubenswrapper[2573]: 
I0421 01:56:31.460624 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835h55d6" event={"ID":"5ba69c9c-9eb3-44b6-ac88-578f8c4b7329","Type":"ContainerDied","Data":"f4d9536e9d0e04e61676be636aa354def484713d3348c21ff4912cac49acc16c"} Apr 21 01:56:31.461731 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:31.461712 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-56c874cd68-dzm28" event={"ID":"530dc938-f98c-4492-a65a-20e3a9d4750c","Type":"ContainerStarted","Data":"1b364c680dc0c997f896692efe054d2a0d579319aee7b1c78bf7e73d93058d49"} Apr 21 01:56:32.469672 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:32.469578 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835h55d6" event={"ID":"5ba69c9c-9eb3-44b6-ac88-578f8c4b7329","Type":"ContainerStarted","Data":"027400d787815aac8ba79e39c969e31574dff583c710e44b2f7e9ad3ec5abfe9"} Apr 21 01:56:33.475020 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:33.474986 2573 generic.go:358] "Generic (PLEG): container finished" podID="5ba69c9c-9eb3-44b6-ac88-578f8c4b7329" containerID="027400d787815aac8ba79e39c969e31574dff583c710e44b2f7e9ad3ec5abfe9" exitCode=0 Apr 21 01:56:33.475420 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:33.475070 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835h55d6" event={"ID":"5ba69c9c-9eb3-44b6-ac88-578f8c4b7329","Type":"ContainerDied","Data":"027400d787815aac8ba79e39c969e31574dff583c710e44b2f7e9ad3ec5abfe9"} Apr 21 01:56:34.489089 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:34.489047 2573 generic.go:358] "Generic (PLEG): container finished" podID="5ba69c9c-9eb3-44b6-ac88-578f8c4b7329" containerID="d14a60cc17009591de48c749de68a8a5ebf83f1a2c157faea3f8e112b159fef2" exitCode=0 Apr 21 01:56:34.489474 
ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:34.489178 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835h55d6" event={"ID":"5ba69c9c-9eb3-44b6-ac88-578f8c4b7329","Type":"ContainerDied","Data":"d14a60cc17009591de48c749de68a8a5ebf83f1a2c157faea3f8e112b159fef2"} Apr 21 01:56:35.494067 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:35.494030 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-56c874cd68-dzm28" event={"ID":"530dc938-f98c-4492-a65a-20e3a9d4750c","Type":"ContainerStarted","Data":"1004d801e0928fa8549b1fcc2e7c134747f17646f396e458ac0e4179162d2d50"} Apr 21 01:56:35.513072 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:35.513023 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/kube-auth-proxy-56c874cd68-dzm28" podStartSLOduration=1.677398423 podStartE2EDuration="5.513007022s" podCreationTimestamp="2026-04-21 01:56:30 +0000 UTC" firstStartedPulling="2026-04-21 01:56:31.05643415 +0000 UTC m=+388.635736186" lastFinishedPulling="2026-04-21 01:56:34.892042745 +0000 UTC m=+392.471344785" observedRunningTime="2026-04-21 01:56:35.510199848 +0000 UTC m=+393.089502022" watchObservedRunningTime="2026-04-21 01:56:35.513007022 +0000 UTC m=+393.092309141" Apr 21 01:56:35.632112 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:35.632088 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835h55d6" Apr 21 01:56:35.791574 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:35.791482 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5ba69c9c-9eb3-44b6-ac88-578f8c4b7329-util\") pod \"5ba69c9c-9eb3-44b6-ac88-578f8c4b7329\" (UID: \"5ba69c9c-9eb3-44b6-ac88-578f8c4b7329\") " Apr 21 01:56:35.791574 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:35.791548 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g5n7d\" (UniqueName: \"kubernetes.io/projected/5ba69c9c-9eb3-44b6-ac88-578f8c4b7329-kube-api-access-g5n7d\") pod \"5ba69c9c-9eb3-44b6-ac88-578f8c4b7329\" (UID: \"5ba69c9c-9eb3-44b6-ac88-578f8c4b7329\") " Apr 21 01:56:35.791776 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:35.791579 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5ba69c9c-9eb3-44b6-ac88-578f8c4b7329-bundle\") pod \"5ba69c9c-9eb3-44b6-ac88-578f8c4b7329\" (UID: \"5ba69c9c-9eb3-44b6-ac88-578f8c4b7329\") " Apr 21 01:56:35.792531 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:35.792499 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ba69c9c-9eb3-44b6-ac88-578f8c4b7329-bundle" (OuterVolumeSpecName: "bundle") pod "5ba69c9c-9eb3-44b6-ac88-578f8c4b7329" (UID: "5ba69c9c-9eb3-44b6-ac88-578f8c4b7329"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 01:56:35.793681 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:35.793658 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ba69c9c-9eb3-44b6-ac88-578f8c4b7329-kube-api-access-g5n7d" (OuterVolumeSpecName: "kube-api-access-g5n7d") pod "5ba69c9c-9eb3-44b6-ac88-578f8c4b7329" (UID: "5ba69c9c-9eb3-44b6-ac88-578f8c4b7329"). InnerVolumeSpecName "kube-api-access-g5n7d". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 01:56:35.816674 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:35.816629 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ba69c9c-9eb3-44b6-ac88-578f8c4b7329-util" (OuterVolumeSpecName: "util") pod "5ba69c9c-9eb3-44b6-ac88-578f8c4b7329" (UID: "5ba69c9c-9eb3-44b6-ac88-578f8c4b7329"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 01:56:35.893041 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:35.893000 2573 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5ba69c9c-9eb3-44b6-ac88-578f8c4b7329-util\") on node \"ip-10-0-129-52.ec2.internal\" DevicePath \"\"" Apr 21 01:56:35.893041 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:35.893032 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-g5n7d\" (UniqueName: \"kubernetes.io/projected/5ba69c9c-9eb3-44b6-ac88-578f8c4b7329-kube-api-access-g5n7d\") on node \"ip-10-0-129-52.ec2.internal\" DevicePath \"\"" Apr 21 01:56:35.893041 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:35.893044 2573 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5ba69c9c-9eb3-44b6-ac88-578f8c4b7329-bundle\") on node \"ip-10-0-129-52.ec2.internal\" DevicePath \"\"" Apr 21 01:56:36.499621 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:36.499594 2573 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835h55d6" Apr 21 01:56:36.500035 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:36.499598 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835h55d6" event={"ID":"5ba69c9c-9eb3-44b6-ac88-578f8c4b7329","Type":"ContainerDied","Data":"0e487180832c36181fb26aaa53a256a1fef4ac404e277d1bc5d6395f116dc9d3"} Apr 21 01:56:36.500035 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:36.499697 2573 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0e487180832c36181fb26aaa53a256a1fef4ac404e277d1bc5d6395f116dc9d3" Apr 21 01:56:43.763316 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:43.763274 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2gl592"] Apr 21 01:56:43.763720 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:43.763678 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5ba69c9c-9eb3-44b6-ac88-578f8c4b7329" containerName="extract" Apr 21 01:56:43.763720 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:43.763691 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ba69c9c-9eb3-44b6-ac88-578f8c4b7329" containerName="extract" Apr 21 01:56:43.763720 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:43.763701 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5ba69c9c-9eb3-44b6-ac88-578f8c4b7329" containerName="pull" Apr 21 01:56:43.763720 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:43.763709 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ba69c9c-9eb3-44b6-ac88-578f8c4b7329" containerName="pull" Apr 21 01:56:43.763898 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:43.763744 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="5ba69c9c-9eb3-44b6-ac88-578f8c4b7329" containerName="util" Apr 21 01:56:43.763898 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:43.763752 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ba69c9c-9eb3-44b6-ac88-578f8c4b7329" containerName="util" Apr 21 01:56:43.763898 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:43.763828 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="5ba69c9c-9eb3-44b6-ac88-578f8c4b7329" containerName="extract" Apr 21 01:56:43.774036 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:43.774007 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2gl592" Apr 21 01:56:43.778196 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:43.778167 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 21 01:56:43.779174 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:43.779151 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-gj698\"" Apr 21 01:56:43.779307 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:43.779229 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 21 01:56:43.788304 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:43.788273 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2gl592"] Apr 21 01:56:43.966633 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:43.966591 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d6e20efb-af33-40ca-8c8b-52f8a3beeb2b-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2gl592\" (UID: \"d6e20efb-af33-40ca-8c8b-52f8a3beeb2b\") " 
pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2gl592" Apr 21 01:56:43.966810 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:43.966717 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8vzr\" (UniqueName: \"kubernetes.io/projected/d6e20efb-af33-40ca-8c8b-52f8a3beeb2b-kube-api-access-f8vzr\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2gl592\" (UID: \"d6e20efb-af33-40ca-8c8b-52f8a3beeb2b\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2gl592" Apr 21 01:56:43.966810 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:43.966787 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d6e20efb-af33-40ca-8c8b-52f8a3beeb2b-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2gl592\" (UID: \"d6e20efb-af33-40ca-8c8b-52f8a3beeb2b\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2gl592" Apr 21 01:56:44.067655 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:44.067574 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d6e20efb-af33-40ca-8c8b-52f8a3beeb2b-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2gl592\" (UID: \"d6e20efb-af33-40ca-8c8b-52f8a3beeb2b\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2gl592" Apr 21 01:56:44.067655 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:44.067615 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d6e20efb-af33-40ca-8c8b-52f8a3beeb2b-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2gl592\" (UID: \"d6e20efb-af33-40ca-8c8b-52f8a3beeb2b\") " 
pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2gl592" Apr 21 01:56:44.067909 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:44.067678 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f8vzr\" (UniqueName: \"kubernetes.io/projected/d6e20efb-af33-40ca-8c8b-52f8a3beeb2b-kube-api-access-f8vzr\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2gl592\" (UID: \"d6e20efb-af33-40ca-8c8b-52f8a3beeb2b\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2gl592" Apr 21 01:56:44.067983 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:44.067960 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d6e20efb-af33-40ca-8c8b-52f8a3beeb2b-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2gl592\" (UID: \"d6e20efb-af33-40ca-8c8b-52f8a3beeb2b\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2gl592" Apr 21 01:56:44.068050 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:44.068032 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d6e20efb-af33-40ca-8c8b-52f8a3beeb2b-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2gl592\" (UID: \"d6e20efb-af33-40ca-8c8b-52f8a3beeb2b\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2gl592" Apr 21 01:56:44.075800 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:44.075771 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8vzr\" (UniqueName: \"kubernetes.io/projected/d6e20efb-af33-40ca-8c8b-52f8a3beeb2b-kube-api-access-f8vzr\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2gl592\" (UID: \"d6e20efb-af33-40ca-8c8b-52f8a3beeb2b\") " 
pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2gl592" Apr 21 01:56:44.084617 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:44.084591 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2gl592" Apr 21 01:56:44.228711 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:44.228677 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2gl592"] Apr 21 01:56:44.229872 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:56:44.229842 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd6e20efb_af33_40ca_8c8b_52f8a3beeb2b.slice/crio-68b5c261460775b4fe19c9fc32ed0b98fc19beaf9176629acd853eede1377a0a WatchSource:0}: Error finding container 68b5c261460775b4fe19c9fc32ed0b98fc19beaf9176629acd853eede1377a0a: Status 404 returned error can't find the container with id 68b5c261460775b4fe19c9fc32ed0b98fc19beaf9176629acd853eede1377a0a Apr 21 01:56:44.532725 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:44.532690 2573 generic.go:358] "Generic (PLEG): container finished" podID="d6e20efb-af33-40ca-8c8b-52f8a3beeb2b" containerID="41a1ac2a557ff4c35893d1a9fb46294e8a141a20f3ca70a7dd13da950b0c2fb8" exitCode=0 Apr 21 01:56:44.532916 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:44.532774 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2gl592" event={"ID":"d6e20efb-af33-40ca-8c8b-52f8a3beeb2b","Type":"ContainerDied","Data":"41a1ac2a557ff4c35893d1a9fb46294e8a141a20f3ca70a7dd13da950b0c2fb8"} Apr 21 01:56:44.532916 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:44.532810 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2gl592" event={"ID":"d6e20efb-af33-40ca-8c8b-52f8a3beeb2b","Type":"ContainerStarted","Data":"68b5c261460775b4fe19c9fc32ed0b98fc19beaf9176629acd853eede1377a0a"} Apr 21 01:56:46.542130 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:46.542092 2573 generic.go:358] "Generic (PLEG): container finished" podID="d6e20efb-af33-40ca-8c8b-52f8a3beeb2b" containerID="b9596b21e54d09a90388c0dafca8919ff231ae314ce565409e1d2a60bfe6d1be" exitCode=0 Apr 21 01:56:46.542622 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:46.542173 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2gl592" event={"ID":"d6e20efb-af33-40ca-8c8b-52f8a3beeb2b","Type":"ContainerDied","Data":"b9596b21e54d09a90388c0dafca8919ff231ae314ce565409e1d2a60bfe6d1be"} Apr 21 01:56:47.547255 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:47.547223 2573 generic.go:358] "Generic (PLEG): container finished" podID="d6e20efb-af33-40ca-8c8b-52f8a3beeb2b" containerID="d31524766b9caf0041a19255cb3d530b99c30a882db6b95998242c6cefa9db04" exitCode=0 Apr 21 01:56:47.547736 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:47.547278 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2gl592" event={"ID":"d6e20efb-af33-40ca-8c8b-52f8a3beeb2b","Type":"ContainerDied","Data":"d31524766b9caf0041a19255cb3d530b99c30a882db6b95998242c6cefa9db04"} Apr 21 01:56:48.677409 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:48.677385 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2gl592" Apr 21 01:56:48.710642 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:48.710609 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d6e20efb-af33-40ca-8c8b-52f8a3beeb2b-bundle\") pod \"d6e20efb-af33-40ca-8c8b-52f8a3beeb2b\" (UID: \"d6e20efb-af33-40ca-8c8b-52f8a3beeb2b\") " Apr 21 01:56:48.710642 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:48.710649 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d6e20efb-af33-40ca-8c8b-52f8a3beeb2b-util\") pod \"d6e20efb-af33-40ca-8c8b-52f8a3beeb2b\" (UID: \"d6e20efb-af33-40ca-8c8b-52f8a3beeb2b\") " Apr 21 01:56:48.710914 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:48.710733 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f8vzr\" (UniqueName: \"kubernetes.io/projected/d6e20efb-af33-40ca-8c8b-52f8a3beeb2b-kube-api-access-f8vzr\") pod \"d6e20efb-af33-40ca-8c8b-52f8a3beeb2b\" (UID: \"d6e20efb-af33-40ca-8c8b-52f8a3beeb2b\") " Apr 21 01:56:48.711536 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:48.711504 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d6e20efb-af33-40ca-8c8b-52f8a3beeb2b-bundle" (OuterVolumeSpecName: "bundle") pod "d6e20efb-af33-40ca-8c8b-52f8a3beeb2b" (UID: "d6e20efb-af33-40ca-8c8b-52f8a3beeb2b"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 01:56:48.713007 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:48.712976 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6e20efb-af33-40ca-8c8b-52f8a3beeb2b-kube-api-access-f8vzr" (OuterVolumeSpecName: "kube-api-access-f8vzr") pod "d6e20efb-af33-40ca-8c8b-52f8a3beeb2b" (UID: "d6e20efb-af33-40ca-8c8b-52f8a3beeb2b"). InnerVolumeSpecName "kube-api-access-f8vzr". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 01:56:48.716326 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:48.716300 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d6e20efb-af33-40ca-8c8b-52f8a3beeb2b-util" (OuterVolumeSpecName: "util") pod "d6e20efb-af33-40ca-8c8b-52f8a3beeb2b" (UID: "d6e20efb-af33-40ca-8c8b-52f8a3beeb2b"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 01:56:48.811545 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:48.811456 2573 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d6e20efb-af33-40ca-8c8b-52f8a3beeb2b-bundle\") on node \"ip-10-0-129-52.ec2.internal\" DevicePath \"\"" Apr 21 01:56:48.811545 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:48.811486 2573 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d6e20efb-af33-40ca-8c8b-52f8a3beeb2b-util\") on node \"ip-10-0-129-52.ec2.internal\" DevicePath \"\"" Apr 21 01:56:48.811545 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:48.811498 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-f8vzr\" (UniqueName: \"kubernetes.io/projected/d6e20efb-af33-40ca-8c8b-52f8a3beeb2b-kube-api-access-f8vzr\") on node \"ip-10-0-129-52.ec2.internal\" DevicePath \"\"" Apr 21 01:56:49.557014 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:49.556983 2573 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2gl592" Apr 21 01:56:49.557185 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:49.556985 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2gl592" event={"ID":"d6e20efb-af33-40ca-8c8b-52f8a3beeb2b","Type":"ContainerDied","Data":"68b5c261460775b4fe19c9fc32ed0b98fc19beaf9176629acd853eede1377a0a"} Apr 21 01:56:49.557185 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:56:49.557084 2573 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="68b5c261460775b4fe19c9fc32ed0b98fc19beaf9176629acd853eede1377a0a" Apr 21 01:57:46.376271 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:57:46.376234 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73vqlr6"] Apr 21 01:57:46.376737 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:57:46.376617 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d6e20efb-af33-40ca-8c8b-52f8a3beeb2b" containerName="util" Apr 21 01:57:46.376737 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:57:46.376629 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6e20efb-af33-40ca-8c8b-52f8a3beeb2b" containerName="util" Apr 21 01:57:46.376737 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:57:46.376636 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d6e20efb-af33-40ca-8c8b-52f8a3beeb2b" containerName="pull" Apr 21 01:57:46.376737 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:57:46.376641 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6e20efb-af33-40ca-8c8b-52f8a3beeb2b" containerName="pull" Apr 21 01:57:46.376737 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:57:46.376651 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="d6e20efb-af33-40ca-8c8b-52f8a3beeb2b" containerName="extract" Apr 21 01:57:46.376737 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:57:46.376656 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6e20efb-af33-40ca-8c8b-52f8a3beeb2b" containerName="extract" Apr 21 01:57:46.376737 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:57:46.376711 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="d6e20efb-af33-40ca-8c8b-52f8a3beeb2b" containerName="extract" Apr 21 01:57:46.379882 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:57:46.379865 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73vqlr6" Apr 21 01:57:46.382776 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:57:46.382753 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 21 01:57:46.382926 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:57:46.382828 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-bmpqw\"" Apr 21 01:57:46.383738 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:57:46.383722 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 21 01:57:46.389018 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:57:46.388996 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73vqlr6"] Apr 21 01:57:46.411405 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:57:46.411376 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clr5w\" (UniqueName: \"kubernetes.io/projected/04d6ec82-73db-455a-856e-49cff06f126b-kube-api-access-clr5w\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73vqlr6\" (UID: 
\"04d6ec82-73db-455a-856e-49cff06f126b\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73vqlr6" Apr 21 01:57:46.411567 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:57:46.411423 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/04d6ec82-73db-455a-856e-49cff06f126b-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73vqlr6\" (UID: \"04d6ec82-73db-455a-856e-49cff06f126b\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73vqlr6" Apr 21 01:57:46.411567 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:57:46.411488 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/04d6ec82-73db-455a-856e-49cff06f126b-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73vqlr6\" (UID: \"04d6ec82-73db-455a-856e-49cff06f126b\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73vqlr6" Apr 21 01:57:46.512130 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:57:46.512086 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-clr5w\" (UniqueName: \"kubernetes.io/projected/04d6ec82-73db-455a-856e-49cff06f126b-kube-api-access-clr5w\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73vqlr6\" (UID: \"04d6ec82-73db-455a-856e-49cff06f126b\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73vqlr6" Apr 21 01:57:46.512338 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:57:46.512148 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/04d6ec82-73db-455a-856e-49cff06f126b-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73vqlr6\" (UID: \"04d6ec82-73db-455a-856e-49cff06f126b\") " 
pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73vqlr6" Apr 21 01:57:46.512338 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:57:46.512218 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/04d6ec82-73db-455a-856e-49cff06f126b-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73vqlr6\" (UID: \"04d6ec82-73db-455a-856e-49cff06f126b\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73vqlr6" Apr 21 01:57:46.512560 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:57:46.512535 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/04d6ec82-73db-455a-856e-49cff06f126b-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73vqlr6\" (UID: \"04d6ec82-73db-455a-856e-49cff06f126b\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73vqlr6" Apr 21 01:57:46.512625 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:57:46.512559 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/04d6ec82-73db-455a-856e-49cff06f126b-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73vqlr6\" (UID: \"04d6ec82-73db-455a-856e-49cff06f126b\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73vqlr6" Apr 21 01:57:46.520596 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:57:46.520563 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-clr5w\" (UniqueName: \"kubernetes.io/projected/04d6ec82-73db-455a-856e-49cff06f126b-kube-api-access-clr5w\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73vqlr6\" (UID: \"04d6ec82-73db-455a-856e-49cff06f126b\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73vqlr6" Apr 21 01:57:46.690896 ip-10-0-129-52 
kubenswrapper[2573]: I0421 01:57:46.690786 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73vqlr6" Apr 21 01:57:46.816924 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:57:46.816890 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73vqlr6"] Apr 21 01:57:46.819962 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:57:46.819932 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod04d6ec82_73db_455a_856e_49cff06f126b.slice/crio-64683b8ba1978a706c2c4f8708610ca5a160c06744c9678c5ff005dd7201b2df WatchSource:0}: Error finding container 64683b8ba1978a706c2c4f8708610ca5a160c06744c9678c5ff005dd7201b2df: Status 404 returned error can't find the container with id 64683b8ba1978a706c2c4f8708610ca5a160c06744c9678c5ff005dd7201b2df Apr 21 01:57:47.126353 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:57:47.126326 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1csfqc"] Apr 21 01:57:47.129937 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:57:47.129918 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1csfqc" Apr 21 01:57:47.137526 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:57:47.137499 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1csfqc"] Apr 21 01:57:47.217105 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:57:47.217063 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fll76\" (UniqueName: \"kubernetes.io/projected/b88f3b10-9e35-4408-8341-e1cac20e0332-kube-api-access-fll76\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1csfqc\" (UID: \"b88f3b10-9e35-4408-8341-e1cac20e0332\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1csfqc" Apr 21 01:57:47.217279 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:57:47.217136 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b88f3b10-9e35-4408-8341-e1cac20e0332-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1csfqc\" (UID: \"b88f3b10-9e35-4408-8341-e1cac20e0332\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1csfqc" Apr 21 01:57:47.217279 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:57:47.217162 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b88f3b10-9e35-4408-8341-e1cac20e0332-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1csfqc\" (UID: \"b88f3b10-9e35-4408-8341-e1cac20e0332\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1csfqc" Apr 21 01:57:47.317640 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:57:47.317598 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" 
(UniqueName: \"kubernetes.io/empty-dir/b88f3b10-9e35-4408-8341-e1cac20e0332-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1csfqc\" (UID: \"b88f3b10-9e35-4408-8341-e1cac20e0332\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1csfqc" Apr 21 01:57:47.317640 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:57:47.317646 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b88f3b10-9e35-4408-8341-e1cac20e0332-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1csfqc\" (UID: \"b88f3b10-9e35-4408-8341-e1cac20e0332\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1csfqc" Apr 21 01:57:47.317897 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:57:47.317697 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fll76\" (UniqueName: \"kubernetes.io/projected/b88f3b10-9e35-4408-8341-e1cac20e0332-kube-api-access-fll76\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1csfqc\" (UID: \"b88f3b10-9e35-4408-8341-e1cac20e0332\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1csfqc" Apr 21 01:57:47.318018 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:57:47.317997 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b88f3b10-9e35-4408-8341-e1cac20e0332-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1csfqc\" (UID: \"b88f3b10-9e35-4408-8341-e1cac20e0332\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1csfqc" Apr 21 01:57:47.318061 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:57:47.318022 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b88f3b10-9e35-4408-8341-e1cac20e0332-bundle\") pod 
\"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1csfqc\" (UID: \"b88f3b10-9e35-4408-8341-e1cac20e0332\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1csfqc" Apr 21 01:57:47.325296 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:57:47.325272 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fll76\" (UniqueName: \"kubernetes.io/projected/b88f3b10-9e35-4408-8341-e1cac20e0332-kube-api-access-fll76\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1csfqc\" (UID: \"b88f3b10-9e35-4408-8341-e1cac20e0332\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1csfqc" Apr 21 01:57:47.439770 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:57:47.439668 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1csfqc" Apr 21 01:57:47.566835 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:57:47.566796 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1csfqc"] Apr 21 01:57:47.568445 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:57:47.568413 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb88f3b10_9e35_4408_8341_e1cac20e0332.slice/crio-7528c49ef8064d6d277b9b29cbd93ac7d88f27bb276c74a153746f4fa10a73ce WatchSource:0}: Error finding container 7528c49ef8064d6d277b9b29cbd93ac7d88f27bb276c74a153746f4fa10a73ce: Status 404 returned error can't find the container with id 7528c49ef8064d6d277b9b29cbd93ac7d88f27bb276c74a153746f4fa10a73ce Apr 21 01:57:47.717764 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:57:47.717681 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759tjhgt"] Apr 21 01:57:47.721215 ip-10-0-129-52 kubenswrapper[2573]: 
I0421 01:57:47.721196 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759tjhgt" Apr 21 01:57:47.727988 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:57:47.727965 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759tjhgt"] Apr 21 01:57:47.777588 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:57:47.777553 2573 generic.go:358] "Generic (PLEG): container finished" podID="b88f3b10-9e35-4408-8341-e1cac20e0332" containerID="cf7563d4cd4d57b3376c561afa26544e2f93fe602b7e23bba50cd1558b2d97fb" exitCode=0 Apr 21 01:57:47.777775 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:57:47.777633 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1csfqc" event={"ID":"b88f3b10-9e35-4408-8341-e1cac20e0332","Type":"ContainerDied","Data":"cf7563d4cd4d57b3376c561afa26544e2f93fe602b7e23bba50cd1558b2d97fb"} Apr 21 01:57:47.777775 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:57:47.777660 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1csfqc" event={"ID":"b88f3b10-9e35-4408-8341-e1cac20e0332","Type":"ContainerStarted","Data":"7528c49ef8064d6d277b9b29cbd93ac7d88f27bb276c74a153746f4fa10a73ce"} Apr 21 01:57:47.779108 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:57:47.779087 2573 generic.go:358] "Generic (PLEG): container finished" podID="04d6ec82-73db-455a-856e-49cff06f126b" containerID="98d2db72d8b50da1b8bdf62084c2e561189f1f5b67aa5228c3d3a57161896df0" exitCode=0 Apr 21 01:57:47.779183 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:57:47.779122 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73vqlr6" 
event={"ID":"04d6ec82-73db-455a-856e-49cff06f126b","Type":"ContainerDied","Data":"98d2db72d8b50da1b8bdf62084c2e561189f1f5b67aa5228c3d3a57161896df0"} Apr 21 01:57:47.779183 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:57:47.779140 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73vqlr6" event={"ID":"04d6ec82-73db-455a-856e-49cff06f126b","Type":"ContainerStarted","Data":"64683b8ba1978a706c2c4f8708610ca5a160c06744c9678c5ff005dd7201b2df"} Apr 21 01:57:47.821514 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:57:47.821478 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f02c586b-d484-49c2-aa48-94fc19cf622b-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759tjhgt\" (UID: \"f02c586b-d484-49c2-aa48-94fc19cf622b\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759tjhgt" Apr 21 01:57:47.821692 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:57:47.821538 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjnbp\" (UniqueName: \"kubernetes.io/projected/f02c586b-d484-49c2-aa48-94fc19cf622b-kube-api-access-fjnbp\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759tjhgt\" (UID: \"f02c586b-d484-49c2-aa48-94fc19cf622b\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759tjhgt" Apr 21 01:57:47.821692 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:57:47.821568 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f02c586b-d484-49c2-aa48-94fc19cf622b-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759tjhgt\" (UID: \"f02c586b-d484-49c2-aa48-94fc19cf622b\") " 
pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759tjhgt" Apr 21 01:57:47.922160 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:57:47.922123 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f02c586b-d484-49c2-aa48-94fc19cf622b-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759tjhgt\" (UID: \"f02c586b-d484-49c2-aa48-94fc19cf622b\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759tjhgt" Apr 21 01:57:47.922160 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:57:47.922168 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fjnbp\" (UniqueName: \"kubernetes.io/projected/f02c586b-d484-49c2-aa48-94fc19cf622b-kube-api-access-fjnbp\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759tjhgt\" (UID: \"f02c586b-d484-49c2-aa48-94fc19cf622b\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759tjhgt" Apr 21 01:57:47.922373 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:57:47.922201 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f02c586b-d484-49c2-aa48-94fc19cf622b-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759tjhgt\" (UID: \"f02c586b-d484-49c2-aa48-94fc19cf622b\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759tjhgt" Apr 21 01:57:47.922522 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:57:47.922505 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f02c586b-d484-49c2-aa48-94fc19cf622b-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759tjhgt\" (UID: \"f02c586b-d484-49c2-aa48-94fc19cf622b\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759tjhgt" Apr 21 01:57:47.922615 
ip-10-0-129-52 kubenswrapper[2573]: I0421 01:57:47.922598 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f02c586b-d484-49c2-aa48-94fc19cf622b-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759tjhgt\" (UID: \"f02c586b-d484-49c2-aa48-94fc19cf622b\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759tjhgt" Apr 21 01:57:47.929765 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:57:47.929733 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjnbp\" (UniqueName: \"kubernetes.io/projected/f02c586b-d484-49c2-aa48-94fc19cf622b-kube-api-access-fjnbp\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759tjhgt\" (UID: \"f02c586b-d484-49c2-aa48-94fc19cf622b\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759tjhgt" Apr 21 01:57:48.041469 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:57:48.041385 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759tjhgt" Apr 21 01:57:48.119532 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:57:48.119496 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0djk5f"] Apr 21 01:57:48.124622 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:57:48.124594 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0djk5f" Apr 21 01:57:48.132104 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:57:48.132076 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0djk5f"] Apr 21 01:57:48.172152 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:57:48.172125 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759tjhgt"] Apr 21 01:57:48.174515 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:57:48.174480 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf02c586b_d484_49c2_aa48_94fc19cf622b.slice/crio-2afffeb546116934bd22b410e0bbede84cf63a28a9660c9d70486a5afa7f3964 WatchSource:0}: Error finding container 2afffeb546116934bd22b410e0bbede84cf63a28a9660c9d70486a5afa7f3964: Status 404 returned error can't find the container with id 2afffeb546116934bd22b410e0bbede84cf63a28a9660c9d70486a5afa7f3964 Apr 21 01:57:48.224274 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:57:48.224248 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x589l\" (UniqueName: \"kubernetes.io/projected/1000fd17-2fc7-4cc5-bdb8-15d3062b2f81-kube-api-access-x589l\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0djk5f\" (UID: \"1000fd17-2fc7-4cc5-bdb8-15d3062b2f81\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0djk5f" Apr 21 01:57:48.224399 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:57:48.224287 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1000fd17-2fc7-4cc5-bdb8-15d3062b2f81-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0djk5f\" (UID: 
\"1000fd17-2fc7-4cc5-bdb8-15d3062b2f81\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0djk5f" Apr 21 01:57:48.224399 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:57:48.224359 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1000fd17-2fc7-4cc5-bdb8-15d3062b2f81-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0djk5f\" (UID: \"1000fd17-2fc7-4cc5-bdb8-15d3062b2f81\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0djk5f" Apr 21 01:57:48.325572 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:57:48.325500 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x589l\" (UniqueName: \"kubernetes.io/projected/1000fd17-2fc7-4cc5-bdb8-15d3062b2f81-kube-api-access-x589l\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0djk5f\" (UID: \"1000fd17-2fc7-4cc5-bdb8-15d3062b2f81\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0djk5f" Apr 21 01:57:48.325572 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:57:48.325537 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1000fd17-2fc7-4cc5-bdb8-15d3062b2f81-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0djk5f\" (UID: \"1000fd17-2fc7-4cc5-bdb8-15d3062b2f81\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0djk5f" Apr 21 01:57:48.325764 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:57:48.325580 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1000fd17-2fc7-4cc5-bdb8-15d3062b2f81-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0djk5f\" (UID: \"1000fd17-2fc7-4cc5-bdb8-15d3062b2f81\") " 
pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0djk5f" Apr 21 01:57:48.325965 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:57:48.325947 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1000fd17-2fc7-4cc5-bdb8-15d3062b2f81-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0djk5f\" (UID: \"1000fd17-2fc7-4cc5-bdb8-15d3062b2f81\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0djk5f" Apr 21 01:57:48.326026 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:57:48.325963 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1000fd17-2fc7-4cc5-bdb8-15d3062b2f81-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0djk5f\" (UID: \"1000fd17-2fc7-4cc5-bdb8-15d3062b2f81\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0djk5f" Apr 21 01:57:48.335021 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:57:48.333751 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x589l\" (UniqueName: \"kubernetes.io/projected/1000fd17-2fc7-4cc5-bdb8-15d3062b2f81-kube-api-access-x589l\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0djk5f\" (UID: \"1000fd17-2fc7-4cc5-bdb8-15d3062b2f81\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0djk5f" Apr 21 01:57:48.436528 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:57:48.436491 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0djk5f" Apr 21 01:57:48.685270 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:57:48.685244 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0djk5f"] Apr 21 01:57:48.721368 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:57:48.721335 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1000fd17_2fc7_4cc5_bdb8_15d3062b2f81.slice/crio-b09661faf6e11a2c5a871b518f54754f4f179fc378c55d4e87fc4c3b13c19a69 WatchSource:0}: Error finding container b09661faf6e11a2c5a871b518f54754f4f179fc378c55d4e87fc4c3b13c19a69: Status 404 returned error can't find the container with id b09661faf6e11a2c5a871b518f54754f4f179fc378c55d4e87fc4c3b13c19a69 Apr 21 01:57:48.785256 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:57:48.785224 2573 generic.go:358] "Generic (PLEG): container finished" podID="b88f3b10-9e35-4408-8341-e1cac20e0332" containerID="ae27827ea746fd5236a7e1226851455bb81a9082e2af6b676ad58f5873570d4d" exitCode=0 Apr 21 01:57:48.785391 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:57:48.785289 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1csfqc" event={"ID":"b88f3b10-9e35-4408-8341-e1cac20e0332","Type":"ContainerDied","Data":"ae27827ea746fd5236a7e1226851455bb81a9082e2af6b676ad58f5873570d4d"} Apr 21 01:57:48.787415 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:57:48.787388 2573 generic.go:358] "Generic (PLEG): container finished" podID="04d6ec82-73db-455a-856e-49cff06f126b" containerID="4b5d4e8d318b46133ec4b006b38ac161669b2d4feeda1596692ae16e19dd3054" exitCode=0 Apr 21 01:57:48.787504 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:57:48.787468 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73vqlr6" event={"ID":"04d6ec82-73db-455a-856e-49cff06f126b","Type":"ContainerDied","Data":"4b5d4e8d318b46133ec4b006b38ac161669b2d4feeda1596692ae16e19dd3054"} Apr 21 01:57:48.789114 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:57:48.789061 2573 generic.go:358] "Generic (PLEG): container finished" podID="f02c586b-d484-49c2-aa48-94fc19cf622b" containerID="f4a7ce81ec7129ef518617f82de7d3031df415f9c7f1c6756ef709179660dc8b" exitCode=0 Apr 21 01:57:48.789178 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:57:48.789165 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759tjhgt" event={"ID":"f02c586b-d484-49c2-aa48-94fc19cf622b","Type":"ContainerDied","Data":"f4a7ce81ec7129ef518617f82de7d3031df415f9c7f1c6756ef709179660dc8b"} Apr 21 01:57:48.789230 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:57:48.789190 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759tjhgt" event={"ID":"f02c586b-d484-49c2-aa48-94fc19cf622b","Type":"ContainerStarted","Data":"2afffeb546116934bd22b410e0bbede84cf63a28a9660c9d70486a5afa7f3964"} Apr 21 01:57:48.790258 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:57:48.790239 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0djk5f" event={"ID":"1000fd17-2fc7-4cc5-bdb8-15d3062b2f81","Type":"ContainerStarted","Data":"b09661faf6e11a2c5a871b518f54754f4f179fc378c55d4e87fc4c3b13c19a69"} Apr 21 01:57:49.795904 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:57:49.795789 2573 generic.go:358] "Generic (PLEG): container finished" podID="b88f3b10-9e35-4408-8341-e1cac20e0332" containerID="db81e0d509fdbb1892789513680dd5c074b751a561333538213ec4b34e53327b" exitCode=0 Apr 21 01:57:49.795904 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:57:49.795850 2573 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1csfqc" event={"ID":"b88f3b10-9e35-4408-8341-e1cac20e0332","Type":"ContainerDied","Data":"db81e0d509fdbb1892789513680dd5c074b751a561333538213ec4b34e53327b"} Apr 21 01:57:49.797571 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:57:49.797551 2573 generic.go:358] "Generic (PLEG): container finished" podID="04d6ec82-73db-455a-856e-49cff06f126b" containerID="9dba79dab5d790d3b424a53cc71991b2fecb6dd239f31075cab70f02aaa3cc7e" exitCode=0 Apr 21 01:57:49.797687 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:57:49.797601 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73vqlr6" event={"ID":"04d6ec82-73db-455a-856e-49cff06f126b","Type":"ContainerDied","Data":"9dba79dab5d790d3b424a53cc71991b2fecb6dd239f31075cab70f02aaa3cc7e"} Apr 21 01:57:49.799188 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:57:49.799166 2573 generic.go:358] "Generic (PLEG): container finished" podID="f02c586b-d484-49c2-aa48-94fc19cf622b" containerID="31f59181c15553ac9dec15db45353e93cffeffac008e6c695e60d161b585412d" exitCode=0 Apr 21 01:57:49.799284 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:57:49.799252 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759tjhgt" event={"ID":"f02c586b-d484-49c2-aa48-94fc19cf622b","Type":"ContainerDied","Data":"31f59181c15553ac9dec15db45353e93cffeffac008e6c695e60d161b585412d"} Apr 21 01:57:49.800611 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:57:49.800590 2573 generic.go:358] "Generic (PLEG): container finished" podID="1000fd17-2fc7-4cc5-bdb8-15d3062b2f81" containerID="15ec343ca8078a7bf72aa0711b76fc7043df5dcf2c03e0d06580390d2c8cb92c" exitCode=0 Apr 21 01:57:49.800691 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:57:49.800636 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0djk5f" event={"ID":"1000fd17-2fc7-4cc5-bdb8-15d3062b2f81","Type":"ContainerDied","Data":"15ec343ca8078a7bf72aa0711b76fc7043df5dcf2c03e0d06580390d2c8cb92c"} Apr 21 01:57:50.806629 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:57:50.806588 2573 generic.go:358] "Generic (PLEG): container finished" podID="f02c586b-d484-49c2-aa48-94fc19cf622b" containerID="c5aac269764634d2cf1bcf4d3d790fc275cdcae3a517c8833fa9ff829b1cc69e" exitCode=0 Apr 21 01:57:50.807062 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:57:50.806644 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759tjhgt" event={"ID":"f02c586b-d484-49c2-aa48-94fc19cf622b","Type":"ContainerDied","Data":"c5aac269764634d2cf1bcf4d3d790fc275cdcae3a517c8833fa9ff829b1cc69e"} Apr 21 01:57:50.808251 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:57:50.808227 2573 generic.go:358] "Generic (PLEG): container finished" podID="1000fd17-2fc7-4cc5-bdb8-15d3062b2f81" containerID="756af9d6de95e78a7fd74e3e04a14de6f83f4978b76a2bf1b917514a91ac4706" exitCode=0 Apr 21 01:57:50.808362 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:57:50.808281 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0djk5f" event={"ID":"1000fd17-2fc7-4cc5-bdb8-15d3062b2f81","Type":"ContainerDied","Data":"756af9d6de95e78a7fd74e3e04a14de6f83f4978b76a2bf1b917514a91ac4706"} Apr 21 01:57:50.943420 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:57:50.943381 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1csfqc" Apr 21 01:57:50.950235 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:57:50.950206 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fll76\" (UniqueName: \"kubernetes.io/projected/b88f3b10-9e35-4408-8341-e1cac20e0332-kube-api-access-fll76\") pod \"b88f3b10-9e35-4408-8341-e1cac20e0332\" (UID: \"b88f3b10-9e35-4408-8341-e1cac20e0332\") " Apr 21 01:57:50.950358 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:57:50.950251 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b88f3b10-9e35-4408-8341-e1cac20e0332-util\") pod \"b88f3b10-9e35-4408-8341-e1cac20e0332\" (UID: \"b88f3b10-9e35-4408-8341-e1cac20e0332\") " Apr 21 01:57:50.950358 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:57:50.950305 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b88f3b10-9e35-4408-8341-e1cac20e0332-bundle\") pod \"b88f3b10-9e35-4408-8341-e1cac20e0332\" (UID: \"b88f3b10-9e35-4408-8341-e1cac20e0332\") " Apr 21 01:57:50.951094 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:57:50.951062 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b88f3b10-9e35-4408-8341-e1cac20e0332-bundle" (OuterVolumeSpecName: "bundle") pod "b88f3b10-9e35-4408-8341-e1cac20e0332" (UID: "b88f3b10-9e35-4408-8341-e1cac20e0332"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 01:57:50.952919 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:57:50.952890 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b88f3b10-9e35-4408-8341-e1cac20e0332-kube-api-access-fll76" (OuterVolumeSpecName: "kube-api-access-fll76") pod "b88f3b10-9e35-4408-8341-e1cac20e0332" (UID: "b88f3b10-9e35-4408-8341-e1cac20e0332"). InnerVolumeSpecName "kube-api-access-fll76". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 01:57:50.958026 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:57:50.957999 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b88f3b10-9e35-4408-8341-e1cac20e0332-util" (OuterVolumeSpecName: "util") pod "b88f3b10-9e35-4408-8341-e1cac20e0332" (UID: "b88f3b10-9e35-4408-8341-e1cac20e0332"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 01:57:51.017826 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:57:51.017786 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73vqlr6"
Apr 21 01:57:51.050854 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:57:51.050796 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/04d6ec82-73db-455a-856e-49cff06f126b-util\") pod \"04d6ec82-73db-455a-856e-49cff06f126b\" (UID: \"04d6ec82-73db-455a-856e-49cff06f126b\") "
Apr 21 01:57:51.051053 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:57:51.050884 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/04d6ec82-73db-455a-856e-49cff06f126b-bundle\") pod \"04d6ec82-73db-455a-856e-49cff06f126b\" (UID: \"04d6ec82-73db-455a-856e-49cff06f126b\") "
Apr 21 01:57:51.051053 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:57:51.050916 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-clr5w\" (UniqueName: \"kubernetes.io/projected/04d6ec82-73db-455a-856e-49cff06f126b-kube-api-access-clr5w\") pod \"04d6ec82-73db-455a-856e-49cff06f126b\" (UID: \"04d6ec82-73db-455a-856e-49cff06f126b\") "
Apr 21 01:57:51.051223 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:57:51.051204 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fll76\" (UniqueName: \"kubernetes.io/projected/b88f3b10-9e35-4408-8341-e1cac20e0332-kube-api-access-fll76\") on node \"ip-10-0-129-52.ec2.internal\" DevicePath \"\""
Apr 21 01:57:51.051273 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:57:51.051229 2573 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b88f3b10-9e35-4408-8341-e1cac20e0332-util\") on node \"ip-10-0-129-52.ec2.internal\" DevicePath \"\""
Apr 21 01:57:51.051273 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:57:51.051245 2573 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b88f3b10-9e35-4408-8341-e1cac20e0332-bundle\") on node \"ip-10-0-129-52.ec2.internal\" DevicePath \"\""
Apr 21 01:57:51.051521 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:57:51.051498 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/04d6ec82-73db-455a-856e-49cff06f126b-bundle" (OuterVolumeSpecName: "bundle") pod "04d6ec82-73db-455a-856e-49cff06f126b" (UID: "04d6ec82-73db-455a-856e-49cff06f126b"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 01:57:51.053022 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:57:51.053000 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04d6ec82-73db-455a-856e-49cff06f126b-kube-api-access-clr5w" (OuterVolumeSpecName: "kube-api-access-clr5w") pod "04d6ec82-73db-455a-856e-49cff06f126b" (UID: "04d6ec82-73db-455a-856e-49cff06f126b"). InnerVolumeSpecName "kube-api-access-clr5w". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 01:57:51.056634 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:57:51.056595 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/04d6ec82-73db-455a-856e-49cff06f126b-util" (OuterVolumeSpecName: "util") pod "04d6ec82-73db-455a-856e-49cff06f126b" (UID: "04d6ec82-73db-455a-856e-49cff06f126b"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 01:57:51.152133 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:57:51.152100 2573 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/04d6ec82-73db-455a-856e-49cff06f126b-util\") on node \"ip-10-0-129-52.ec2.internal\" DevicePath \"\""
Apr 21 01:57:51.152133 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:57:51.152127 2573 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/04d6ec82-73db-455a-856e-49cff06f126b-bundle\") on node \"ip-10-0-129-52.ec2.internal\" DevicePath \"\""
Apr 21 01:57:51.152133 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:57:51.152137 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-clr5w\" (UniqueName: \"kubernetes.io/projected/04d6ec82-73db-455a-856e-49cff06f126b-kube-api-access-clr5w\") on node \"ip-10-0-129-52.ec2.internal\" DevicePath \"\""
Apr 21 01:57:51.814104 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:57:51.814071 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1csfqc" event={"ID":"b88f3b10-9e35-4408-8341-e1cac20e0332","Type":"ContainerDied","Data":"7528c49ef8064d6d277b9b29cbd93ac7d88f27bb276c74a153746f4fa10a73ce"}
Apr 21 01:57:51.814104 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:57:51.814104 2573 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7528c49ef8064d6d277b9b29cbd93ac7d88f27bb276c74a153746f4fa10a73ce"
Apr 21 01:57:51.814104 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:57:51.814104 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1csfqc"
Apr 21 01:57:51.815734 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:57:51.815711 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73vqlr6"
Apr 21 01:57:51.815891 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:57:51.815713 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73vqlr6" event={"ID":"04d6ec82-73db-455a-856e-49cff06f126b","Type":"ContainerDied","Data":"64683b8ba1978a706c2c4f8708610ca5a160c06744c9678c5ff005dd7201b2df"}
Apr 21 01:57:51.815891 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:57:51.815846 2573 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="64683b8ba1978a706c2c4f8708610ca5a160c06744c9678c5ff005dd7201b2df"
Apr 21 01:57:51.817755 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:57:51.817725 2573 generic.go:358] "Generic (PLEG): container finished" podID="1000fd17-2fc7-4cc5-bdb8-15d3062b2f81" containerID="b4feb2f0e7ad1c0a8cc9e24fca8c309c2289e2591b53ece7bd8658b4d1e94bd3" exitCode=0
Apr 21 01:57:51.817947 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:57:51.817845 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0djk5f" event={"ID":"1000fd17-2fc7-4cc5-bdb8-15d3062b2f81","Type":"ContainerDied","Data":"b4feb2f0e7ad1c0a8cc9e24fca8c309c2289e2591b53ece7bd8658b4d1e94bd3"}
Apr 21 01:57:51.950387 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:57:51.950362 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759tjhgt"
Apr 21 01:57:51.957770 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:57:51.957747 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f02c586b-d484-49c2-aa48-94fc19cf622b-bundle\") pod \"f02c586b-d484-49c2-aa48-94fc19cf622b\" (UID: \"f02c586b-d484-49c2-aa48-94fc19cf622b\") "
Apr 21 01:57:51.957868 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:57:51.957776 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f02c586b-d484-49c2-aa48-94fc19cf622b-util\") pod \"f02c586b-d484-49c2-aa48-94fc19cf622b\" (UID: \"f02c586b-d484-49c2-aa48-94fc19cf622b\") "
Apr 21 01:57:51.957868 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:57:51.957801 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fjnbp\" (UniqueName: \"kubernetes.io/projected/f02c586b-d484-49c2-aa48-94fc19cf622b-kube-api-access-fjnbp\") pod \"f02c586b-d484-49c2-aa48-94fc19cf622b\" (UID: \"f02c586b-d484-49c2-aa48-94fc19cf622b\") "
Apr 21 01:57:51.958249 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:57:51.958225 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f02c586b-d484-49c2-aa48-94fc19cf622b-bundle" (OuterVolumeSpecName: "bundle") pod "f02c586b-d484-49c2-aa48-94fc19cf622b" (UID: "f02c586b-d484-49c2-aa48-94fc19cf622b"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 01:57:51.959980 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:57:51.959955 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f02c586b-d484-49c2-aa48-94fc19cf622b-kube-api-access-fjnbp" (OuterVolumeSpecName: "kube-api-access-fjnbp") pod "f02c586b-d484-49c2-aa48-94fc19cf622b" (UID: "f02c586b-d484-49c2-aa48-94fc19cf622b"). InnerVolumeSpecName "kube-api-access-fjnbp". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 01:57:51.965428 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:57:51.965397 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f02c586b-d484-49c2-aa48-94fc19cf622b-util" (OuterVolumeSpecName: "util") pod "f02c586b-d484-49c2-aa48-94fc19cf622b" (UID: "f02c586b-d484-49c2-aa48-94fc19cf622b"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 01:57:52.059060 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:57:52.059027 2573 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f02c586b-d484-49c2-aa48-94fc19cf622b-bundle\") on node \"ip-10-0-129-52.ec2.internal\" DevicePath \"\""
Apr 21 01:57:52.059060 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:57:52.059056 2573 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f02c586b-d484-49c2-aa48-94fc19cf622b-util\") on node \"ip-10-0-129-52.ec2.internal\" DevicePath \"\""
Apr 21 01:57:52.059060 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:57:52.059066 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fjnbp\" (UniqueName: \"kubernetes.io/projected/f02c586b-d484-49c2-aa48-94fc19cf622b-kube-api-access-fjnbp\") on node \"ip-10-0-129-52.ec2.internal\" DevicePath \"\""
Apr 21 01:57:52.822899 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:57:52.822860 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759tjhgt" event={"ID":"f02c586b-d484-49c2-aa48-94fc19cf622b","Type":"ContainerDied","Data":"2afffeb546116934bd22b410e0bbede84cf63a28a9660c9d70486a5afa7f3964"}
Apr 21 01:57:52.822899 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:57:52.822900 2573 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2afffeb546116934bd22b410e0bbede84cf63a28a9660c9d70486a5afa7f3964"
Apr 21 01:57:52.822899 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:57:52.822903 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759tjhgt"
Apr 21 01:57:52.949805 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:57:52.949780 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0djk5f"
Apr 21 01:57:52.966550 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:57:52.966520 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1000fd17-2fc7-4cc5-bdb8-15d3062b2f81-bundle\") pod \"1000fd17-2fc7-4cc5-bdb8-15d3062b2f81\" (UID: \"1000fd17-2fc7-4cc5-bdb8-15d3062b2f81\") "
Apr 21 01:57:52.966731 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:57:52.966586 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1000fd17-2fc7-4cc5-bdb8-15d3062b2f81-util\") pod \"1000fd17-2fc7-4cc5-bdb8-15d3062b2f81\" (UID: \"1000fd17-2fc7-4cc5-bdb8-15d3062b2f81\") "
Apr 21 01:57:52.966802 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:57:52.966732 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x589l\" (UniqueName: \"kubernetes.io/projected/1000fd17-2fc7-4cc5-bdb8-15d3062b2f81-kube-api-access-x589l\") pod \"1000fd17-2fc7-4cc5-bdb8-15d3062b2f81\" (UID: \"1000fd17-2fc7-4cc5-bdb8-15d3062b2f81\") "
Apr 21 01:57:52.967074 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:57:52.967044 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1000fd17-2fc7-4cc5-bdb8-15d3062b2f81-bundle" (OuterVolumeSpecName: "bundle") pod "1000fd17-2fc7-4cc5-bdb8-15d3062b2f81" (UID: "1000fd17-2fc7-4cc5-bdb8-15d3062b2f81"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 01:57:52.969288 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:57:52.969252 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1000fd17-2fc7-4cc5-bdb8-15d3062b2f81-kube-api-access-x589l" (OuterVolumeSpecName: "kube-api-access-x589l") pod "1000fd17-2fc7-4cc5-bdb8-15d3062b2f81" (UID: "1000fd17-2fc7-4cc5-bdb8-15d3062b2f81"). InnerVolumeSpecName "kube-api-access-x589l". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 01:57:52.972162 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:57:52.972136 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1000fd17-2fc7-4cc5-bdb8-15d3062b2f81-util" (OuterVolumeSpecName: "util") pod "1000fd17-2fc7-4cc5-bdb8-15d3062b2f81" (UID: "1000fd17-2fc7-4cc5-bdb8-15d3062b2f81"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 01:57:53.067459 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:57:53.067425 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-x589l\" (UniqueName: \"kubernetes.io/projected/1000fd17-2fc7-4cc5-bdb8-15d3062b2f81-kube-api-access-x589l\") on node \"ip-10-0-129-52.ec2.internal\" DevicePath \"\""
Apr 21 01:57:53.067459 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:57:53.067451 2573 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1000fd17-2fc7-4cc5-bdb8-15d3062b2f81-bundle\") on node \"ip-10-0-129-52.ec2.internal\" DevicePath \"\""
Apr 21 01:57:53.067459 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:57:53.067462 2573 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1000fd17-2fc7-4cc5-bdb8-15d3062b2f81-util\") on node \"ip-10-0-129-52.ec2.internal\" DevicePath \"\""
Apr 21 01:57:53.828466 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:57:53.828436 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0djk5f" event={"ID":"1000fd17-2fc7-4cc5-bdb8-15d3062b2f81","Type":"ContainerDied","Data":"b09661faf6e11a2c5a871b518f54754f4f179fc378c55d4e87fc4c3b13c19a69"}
Apr 21 01:57:53.828466 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:57:53.828460 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0djk5f"
Apr 21 01:57:53.828466 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:57:53.828470 2573 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b09661faf6e11a2c5a871b518f54754f4f179fc378c55d4e87fc4c3b13c19a69"
Apr 21 01:58:20.115715 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:58:20.115676 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-klfxt"]
Apr 21 01:58:20.116328 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:58:20.116206 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1000fd17-2fc7-4cc5-bdb8-15d3062b2f81" containerName="util"
Apr 21 01:58:20.116328 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:58:20.116226 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="1000fd17-2fc7-4cc5-bdb8-15d3062b2f81" containerName="util"
Apr 21 01:58:20.116328 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:58:20.116236 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="04d6ec82-73db-455a-856e-49cff06f126b" containerName="extract"
Apr 21 01:58:20.116328 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:58:20.116245 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="04d6ec82-73db-455a-856e-49cff06f126b" containerName="extract"
Apr 21 01:58:20.116328 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:58:20.116259 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="04d6ec82-73db-455a-856e-49cff06f126b" containerName="pull"
Apr 21 01:58:20.116328 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:58:20.116267 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="04d6ec82-73db-455a-856e-49cff06f126b" containerName="pull"
Apr 21 01:58:20.116328 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:58:20.116279 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b88f3b10-9e35-4408-8341-e1cac20e0332" containerName="util"
Apr 21 01:58:20.116328 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:58:20.116287 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="b88f3b10-9e35-4408-8341-e1cac20e0332" containerName="util"
Apr 21 01:58:20.116328 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:58:20.116301 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f02c586b-d484-49c2-aa48-94fc19cf622b" containerName="util"
Apr 21 01:58:20.116328 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:58:20.116309 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="f02c586b-d484-49c2-aa48-94fc19cf622b" containerName="util"
Apr 21 01:58:20.116328 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:58:20.116323 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f02c586b-d484-49c2-aa48-94fc19cf622b" containerName="extract"
Apr 21 01:58:20.116328 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:58:20.116330 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="f02c586b-d484-49c2-aa48-94fc19cf622b" containerName="extract"
Apr 21 01:58:20.116953 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:58:20.116339 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1000fd17-2fc7-4cc5-bdb8-15d3062b2f81" containerName="pull"
Apr 21 01:58:20.116953 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:58:20.116347 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="1000fd17-2fc7-4cc5-bdb8-15d3062b2f81" containerName="pull"
Apr 21 01:58:20.116953 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:58:20.116359 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b88f3b10-9e35-4408-8341-e1cac20e0332" containerName="extract"
Apr 21 01:58:20.116953 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:58:20.116367 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="b88f3b10-9e35-4408-8341-e1cac20e0332" containerName="extract"
Apr 21 01:58:20.116953 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:58:20.116384 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1000fd17-2fc7-4cc5-bdb8-15d3062b2f81" containerName="extract"
Apr 21 01:58:20.116953 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:58:20.116394 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="1000fd17-2fc7-4cc5-bdb8-15d3062b2f81" containerName="extract"
Apr 21 01:58:20.116953 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:58:20.116406 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="04d6ec82-73db-455a-856e-49cff06f126b" containerName="util"
Apr 21 01:58:20.116953 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:58:20.116414 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="04d6ec82-73db-455a-856e-49cff06f126b" containerName="util"
Apr 21 01:58:20.116953 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:58:20.116426 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b88f3b10-9e35-4408-8341-e1cac20e0332" containerName="pull"
Apr 21 01:58:20.116953 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:58:20.116434 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="b88f3b10-9e35-4408-8341-e1cac20e0332" containerName="pull"
Apr 21 01:58:20.116953 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:58:20.116448 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f02c586b-d484-49c2-aa48-94fc19cf622b" containerName="pull"
Apr 21 01:58:20.116953 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:58:20.116456 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="f02c586b-d484-49c2-aa48-94fc19cf622b" containerName="pull"
Apr 21 01:58:20.116953 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:58:20.116541 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="04d6ec82-73db-455a-856e-49cff06f126b" containerName="extract"
Apr 21 01:58:20.116953 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:58:20.116554 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="f02c586b-d484-49c2-aa48-94fc19cf622b" containerName="extract"
Apr 21 01:58:20.116953 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:58:20.116565 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="b88f3b10-9e35-4408-8341-e1cac20e0332" containerName="extract"
Apr 21 01:58:20.116953 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:58:20.116580 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="1000fd17-2fc7-4cc5-bdb8-15d3062b2f81" containerName="extract"
Apr 21 01:58:20.119794 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:58:20.119775 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-klfxt"
Apr 21 01:58:20.122886 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:58:20.122596 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kuadrant-console-nginx-conf\""
Apr 21 01:58:20.122886 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:58:20.122788 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-bmpqw\""
Apr 21 01:58:20.123607 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:58:20.123583 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\""
Apr 21 01:58:20.123716 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:58:20.123624 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"plugin-serving-cert\""
Apr 21 01:58:20.123716 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:58:20.123660 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\""
Apr 21 01:58:20.126597 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:58:20.126570 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-klfxt"]
Apr 21 01:58:20.223187 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:58:20.223142 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/0678367a-88d0-4159-a101-f1cb29cee691-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-klfxt\" (UID: \"0678367a-88d0-4159-a101-f1cb29cee691\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-klfxt"
Apr 21 01:58:20.223397 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:58:20.223259 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5gc24\" (UniqueName: \"kubernetes.io/projected/0678367a-88d0-4159-a101-f1cb29cee691-kube-api-access-5gc24\") pod \"kuadrant-console-plugin-6cb54b5c86-klfxt\" (UID: \"0678367a-88d0-4159-a101-f1cb29cee691\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-klfxt"
Apr 21 01:58:20.223397 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:58:20.223377 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/0678367a-88d0-4159-a101-f1cb29cee691-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-klfxt\" (UID: \"0678367a-88d0-4159-a101-f1cb29cee691\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-klfxt"
Apr 21 01:58:20.324871 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:58:20.324826 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/0678367a-88d0-4159-a101-f1cb29cee691-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-klfxt\" (UID: \"0678367a-88d0-4159-a101-f1cb29cee691\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-klfxt"
Apr 21 01:58:20.325070 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:58:20.324911 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5gc24\" (UniqueName: \"kubernetes.io/projected/0678367a-88d0-4159-a101-f1cb29cee691-kube-api-access-5gc24\") pod \"kuadrant-console-plugin-6cb54b5c86-klfxt\" (UID: \"0678367a-88d0-4159-a101-f1cb29cee691\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-klfxt"
Apr 21 01:58:20.325070 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:58:20.324938 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/0678367a-88d0-4159-a101-f1cb29cee691-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-klfxt\" (UID: \"0678367a-88d0-4159-a101-f1cb29cee691\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-klfxt"
Apr 21 01:58:20.325070 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:58:20.325051 2573 secret.go:189] Couldn't get secret kuadrant-system/plugin-serving-cert: secret "plugin-serving-cert" not found
Apr 21 01:58:20.325236 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:58:20.325114 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0678367a-88d0-4159-a101-f1cb29cee691-plugin-serving-cert podName:0678367a-88d0-4159-a101-f1cb29cee691 nodeName:}" failed. No retries permitted until 2026-04-21 01:58:20.825096361 +0000 UTC m=+498.404398397 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/0678367a-88d0-4159-a101-f1cb29cee691-plugin-serving-cert") pod "kuadrant-console-plugin-6cb54b5c86-klfxt" (UID: "0678367a-88d0-4159-a101-f1cb29cee691") : secret "plugin-serving-cert" not found
Apr 21 01:58:20.325570 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:58:20.325539 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/0678367a-88d0-4159-a101-f1cb29cee691-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-klfxt\" (UID: \"0678367a-88d0-4159-a101-f1cb29cee691\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-klfxt"
Apr 21 01:58:20.333515 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:58:20.333480 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5gc24\" (UniqueName: \"kubernetes.io/projected/0678367a-88d0-4159-a101-f1cb29cee691-kube-api-access-5gc24\") pod \"kuadrant-console-plugin-6cb54b5c86-klfxt\" (UID: \"0678367a-88d0-4159-a101-f1cb29cee691\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-klfxt"
Apr 21 01:58:20.831164 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:58:20.831125 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/0678367a-88d0-4159-a101-f1cb29cee691-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-klfxt\" (UID: \"0678367a-88d0-4159-a101-f1cb29cee691\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-klfxt"
Apr 21 01:58:20.833606 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:58:20.833570 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/0678367a-88d0-4159-a101-f1cb29cee691-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-klfxt\" (UID: \"0678367a-88d0-4159-a101-f1cb29cee691\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-klfxt"
Apr 21 01:58:21.031514 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:58:21.031473 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-klfxt"
Apr 21 01:58:21.161570 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:58:21.161540 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-klfxt"]
Apr 21 01:58:21.163314 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:58:21.163279 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0678367a_88d0_4159_a101_f1cb29cee691.slice/crio-51c07a564ea4e040f56d1589852c35579758c07bb5527cf4b60a6013ba03425d WatchSource:0}: Error finding container 51c07a564ea4e040f56d1589852c35579758c07bb5527cf4b60a6013ba03425d: Status 404 returned error can't find the container with id 51c07a564ea4e040f56d1589852c35579758c07bb5527cf4b60a6013ba03425d
Apr 21 01:58:21.937631 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:58:21.937562 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-klfxt" event={"ID":"0678367a-88d0-4159-a101-f1cb29cee691","Type":"ContainerStarted","Data":"51c07a564ea4e040f56d1589852c35579758c07bb5527cf4b60a6013ba03425d"}
Apr 21 01:58:46.045557 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:58:46.045514 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-klfxt" event={"ID":"0678367a-88d0-4159-a101-f1cb29cee691","Type":"ContainerStarted","Data":"fdeb68e8f72ad0b5da7544d9684438d51efff5d49ce5846ad0840be5ebe3de07"}
Apr 21 01:58:46.064860 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:58:46.064784 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-klfxt" podStartSLOduration=1.654055703 podStartE2EDuration="26.064768932s" podCreationTimestamp="2026-04-21 01:58:20 +0000 UTC" firstStartedPulling="2026-04-21 01:58:21.1643571 +0000 UTC m=+498.743659140" lastFinishedPulling="2026-04-21 01:58:45.575070318 +0000 UTC m=+523.154372369" observedRunningTime="2026-04-21 01:58:46.063232292 +0000 UTC m=+523.642534353" watchObservedRunningTime="2026-04-21 01:58:46.064768932 +0000 UTC m=+523.644070989"
Apr 21 01:58:59.715493 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:58:59.715401 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-7f6957968c-9wh6s"]
Apr 21 01:58:59.767688 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:58:59.767647 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7f6957968c-9wh6s"]
Apr 21 01:58:59.767896 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:58:59.767768 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7f6957968c-9wh6s"
Apr 21 01:58:59.888045 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:58:59.887999 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9psm\" (UniqueName: \"kubernetes.io/projected/66283c5e-bf84-4eca-adf5-0e75883b19b2-kube-api-access-t9psm\") pod \"console-7f6957968c-9wh6s\" (UID: \"66283c5e-bf84-4eca-adf5-0e75883b19b2\") " pod="openshift-console/console-7f6957968c-9wh6s"
Apr 21 01:58:59.888045 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:58:59.888048 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/66283c5e-bf84-4eca-adf5-0e75883b19b2-service-ca\") pod \"console-7f6957968c-9wh6s\" (UID: \"66283c5e-bf84-4eca-adf5-0e75883b19b2\") " pod="openshift-console/console-7f6957968c-9wh6s"
Apr 21 01:58:59.888298 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:58:59.888071 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/66283c5e-bf84-4eca-adf5-0e75883b19b2-oauth-serving-cert\") pod \"console-7f6957968c-9wh6s\" (UID: \"66283c5e-bf84-4eca-adf5-0e75883b19b2\") " pod="openshift-console/console-7f6957968c-9wh6s"
Apr 21 01:58:59.888298 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:58:59.888102 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/66283c5e-bf84-4eca-adf5-0e75883b19b2-console-config\") pod \"console-7f6957968c-9wh6s\" (UID: \"66283c5e-bf84-4eca-adf5-0e75883b19b2\") " pod="openshift-console/console-7f6957968c-9wh6s"
Apr 21 01:58:59.888298 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:58:59.888138 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/66283c5e-bf84-4eca-adf5-0e75883b19b2-console-serving-cert\") pod \"console-7f6957968c-9wh6s\" (UID: \"66283c5e-bf84-4eca-adf5-0e75883b19b2\") " pod="openshift-console/console-7f6957968c-9wh6s"
Apr 21 01:58:59.888298 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:58:59.888175 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/66283c5e-bf84-4eca-adf5-0e75883b19b2-console-oauth-config\") pod \"console-7f6957968c-9wh6s\" (UID: \"66283c5e-bf84-4eca-adf5-0e75883b19b2\") " pod="openshift-console/console-7f6957968c-9wh6s"
Apr 21 01:58:59.888298 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:58:59.888193 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/66283c5e-bf84-4eca-adf5-0e75883b19b2-trusted-ca-bundle\") pod \"console-7f6957968c-9wh6s\" (UID: \"66283c5e-bf84-4eca-adf5-0e75883b19b2\") " pod="openshift-console/console-7f6957968c-9wh6s"
Apr 21 01:58:59.989702 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:58:59.989614 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/66283c5e-bf84-4eca-adf5-0e75883b19b2-console-oauth-config\") pod \"console-7f6957968c-9wh6s\" (UID: \"66283c5e-bf84-4eca-adf5-0e75883b19b2\") " pod="openshift-console/console-7f6957968c-9wh6s"
Apr 21 01:58:59.989702 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:58:59.989647 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/66283c5e-bf84-4eca-adf5-0e75883b19b2-trusted-ca-bundle\") pod \"console-7f6957968c-9wh6s\" (UID: \"66283c5e-bf84-4eca-adf5-0e75883b19b2\") " pod="openshift-console/console-7f6957968c-9wh6s"
Apr 21 01:58:59.989702 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:58:59.989702 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t9psm\" (UniqueName: \"kubernetes.io/projected/66283c5e-bf84-4eca-adf5-0e75883b19b2-kube-api-access-t9psm\") pod \"console-7f6957968c-9wh6s\" (UID: \"66283c5e-bf84-4eca-adf5-0e75883b19b2\") " pod="openshift-console/console-7f6957968c-9wh6s"
Apr 21 01:58:59.990022 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:58:59.989722 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/66283c5e-bf84-4eca-adf5-0e75883b19b2-service-ca\") pod \"console-7f6957968c-9wh6s\" (UID: \"66283c5e-bf84-4eca-adf5-0e75883b19b2\") " pod="openshift-console/console-7f6957968c-9wh6s"
Apr 21 01:58:59.990022 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:58:59.989742 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/66283c5e-bf84-4eca-adf5-0e75883b19b2-oauth-serving-cert\") pod \"console-7f6957968c-9wh6s\" (UID: \"66283c5e-bf84-4eca-adf5-0e75883b19b2\") " pod="openshift-console/console-7f6957968c-9wh6s"
Apr 21 01:58:59.990022 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:58:59.989758 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/66283c5e-bf84-4eca-adf5-0e75883b19b2-console-config\") pod \"console-7f6957968c-9wh6s\" (UID: \"66283c5e-bf84-4eca-adf5-0e75883b19b2\") " pod="openshift-console/console-7f6957968c-9wh6s"
Apr 21 01:58:59.990022 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:58:59.989779 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/66283c5e-bf84-4eca-adf5-0e75883b19b2-console-serving-cert\") pod \"console-7f6957968c-9wh6s\" (UID: \"66283c5e-bf84-4eca-adf5-0e75883b19b2\") " pod="openshift-console/console-7f6957968c-9wh6s"
Apr 21 01:58:59.990639 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:58:59.990613 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/66283c5e-bf84-4eca-adf5-0e75883b19b2-service-ca\") pod \"console-7f6957968c-9wh6s\" (UID: \"66283c5e-bf84-4eca-adf5-0e75883b19b2\") " pod="openshift-console/console-7f6957968c-9wh6s"
Apr 21 01:58:59.990747 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:58:59.990636 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/66283c5e-bf84-4eca-adf5-0e75883b19b2-console-config\") pod \"console-7f6957968c-9wh6s\" (UID: \"66283c5e-bf84-4eca-adf5-0e75883b19b2\") " pod="openshift-console/console-7f6957968c-9wh6s"
Apr 21 01:58:59.990747 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:58:59.990636 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName:
\"kubernetes.io/configmap/66283c5e-bf84-4eca-adf5-0e75883b19b2-oauth-serving-cert\") pod \"console-7f6957968c-9wh6s\" (UID: \"66283c5e-bf84-4eca-adf5-0e75883b19b2\") " pod="openshift-console/console-7f6957968c-9wh6s" Apr 21 01:58:59.990747 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:58:59.990669 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/66283c5e-bf84-4eca-adf5-0e75883b19b2-trusted-ca-bundle\") pod \"console-7f6957968c-9wh6s\" (UID: \"66283c5e-bf84-4eca-adf5-0e75883b19b2\") " pod="openshift-console/console-7f6957968c-9wh6s" Apr 21 01:58:59.992305 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:58:59.992284 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/66283c5e-bf84-4eca-adf5-0e75883b19b2-console-oauth-config\") pod \"console-7f6957968c-9wh6s\" (UID: \"66283c5e-bf84-4eca-adf5-0e75883b19b2\") " pod="openshift-console/console-7f6957968c-9wh6s" Apr 21 01:58:59.992396 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:58:59.992335 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/66283c5e-bf84-4eca-adf5-0e75883b19b2-console-serving-cert\") pod \"console-7f6957968c-9wh6s\" (UID: \"66283c5e-bf84-4eca-adf5-0e75883b19b2\") " pod="openshift-console/console-7f6957968c-9wh6s" Apr 21 01:58:59.997228 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:58:59.997204 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t9psm\" (UniqueName: \"kubernetes.io/projected/66283c5e-bf84-4eca-adf5-0e75883b19b2-kube-api-access-t9psm\") pod \"console-7f6957968c-9wh6s\" (UID: \"66283c5e-bf84-4eca-adf5-0e75883b19b2\") " pod="openshift-console/console-7f6957968c-9wh6s" Apr 21 01:59:00.079999 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:59:00.079956 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7f6957968c-9wh6s" Apr 21 01:59:00.204845 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:59:00.204797 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7f6957968c-9wh6s"] Apr 21 01:59:00.206848 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:59:00.206797 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod66283c5e_bf84_4eca_adf5_0e75883b19b2.slice/crio-bea84dc3f7d981d06326f2b8e4151206a3da81866683dbb0ec926b0833b8d346 WatchSource:0}: Error finding container bea84dc3f7d981d06326f2b8e4151206a3da81866683dbb0ec926b0833b8d346: Status 404 returned error can't find the container with id bea84dc3f7d981d06326f2b8e4151206a3da81866683dbb0ec926b0833b8d346 Apr 21 01:59:01.103156 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:59:01.103121 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7f6957968c-9wh6s" event={"ID":"66283c5e-bf84-4eca-adf5-0e75883b19b2","Type":"ContainerStarted","Data":"51fd716d8a53ba1c520e03787610970c0211ae3cfae880f768e59dfdd02b781a"} Apr 21 01:59:01.103156 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:59:01.103159 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7f6957968c-9wh6s" event={"ID":"66283c5e-bf84-4eca-adf5-0e75883b19b2","Type":"ContainerStarted","Data":"bea84dc3f7d981d06326f2b8e4151206a3da81866683dbb0ec926b0833b8d346"} Apr 21 01:59:01.120684 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:59:01.120628 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7f6957968c-9wh6s" podStartSLOduration=2.120613296 podStartE2EDuration="2.120613296s" podCreationTimestamp="2026-04-21 01:58:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 01:59:01.11900931 +0000 UTC m=+538.698311373" 
watchObservedRunningTime="2026-04-21 01:59:01.120613296 +0000 UTC m=+538.699915352" Apr 21 01:59:10.080399 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:59:10.080336 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-7f6957968c-9wh6s" Apr 21 01:59:10.080399 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:59:10.080404 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7f6957968c-9wh6s" Apr 21 01:59:10.085110 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:59:10.085085 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7f6957968c-9wh6s" Apr 21 01:59:10.139340 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:59:10.139310 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7f6957968c-9wh6s" Apr 21 01:59:10.182422 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:59:10.182385 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7d8db5cdb6-rcp5r"] Apr 21 01:59:14.777491 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:59:14.777447 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-6pbq9"] Apr 21 01:59:14.995029 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:59:14.994984 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-6pbq9"] Apr 21 01:59:14.995029 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:59:14.995020 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-6pbq9"] Apr 21 01:59:14.995244 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:59:14.995139 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-78c99df468-6pbq9" Apr 21 01:59:14.997836 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:59:14.997794 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\"" Apr 21 01:59:15.127264 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:59:15.127231 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/acc755ef-95d8-4417-b574-be065be2272b-config-file\") pod \"limitador-limitador-78c99df468-6pbq9\" (UID: \"acc755ef-95d8-4417-b574-be065be2272b\") " pod="kuadrant-system/limitador-limitador-78c99df468-6pbq9" Apr 21 01:59:15.127423 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:59:15.127291 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4tbf\" (UniqueName: \"kubernetes.io/projected/acc755ef-95d8-4417-b574-be065be2272b-kube-api-access-v4tbf\") pod \"limitador-limitador-78c99df468-6pbq9\" (UID: \"acc755ef-95d8-4417-b574-be065be2272b\") " pod="kuadrant-system/limitador-limitador-78c99df468-6pbq9" Apr 21 01:59:15.228633 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:59:15.228599 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v4tbf\" (UniqueName: \"kubernetes.io/projected/acc755ef-95d8-4417-b574-be065be2272b-kube-api-access-v4tbf\") pod \"limitador-limitador-78c99df468-6pbq9\" (UID: \"acc755ef-95d8-4417-b574-be065be2272b\") " pod="kuadrant-system/limitador-limitador-78c99df468-6pbq9" Apr 21 01:59:15.228855 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:59:15.228695 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/acc755ef-95d8-4417-b574-be065be2272b-config-file\") pod \"limitador-limitador-78c99df468-6pbq9\" (UID: 
\"acc755ef-95d8-4417-b574-be065be2272b\") " pod="kuadrant-system/limitador-limitador-78c99df468-6pbq9" Apr 21 01:59:15.229258 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:59:15.229240 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/acc755ef-95d8-4417-b574-be065be2272b-config-file\") pod \"limitador-limitador-78c99df468-6pbq9\" (UID: \"acc755ef-95d8-4417-b574-be065be2272b\") " pod="kuadrant-system/limitador-limitador-78c99df468-6pbq9" Apr 21 01:59:15.238890 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:59:15.238859 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4tbf\" (UniqueName: \"kubernetes.io/projected/acc755ef-95d8-4417-b574-be065be2272b-kube-api-access-v4tbf\") pod \"limitador-limitador-78c99df468-6pbq9\" (UID: \"acc755ef-95d8-4417-b574-be065be2272b\") " pod="kuadrant-system/limitador-limitador-78c99df468-6pbq9" Apr 21 01:59:15.305221 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:59:15.305183 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-78c99df468-6pbq9" Apr 21 01:59:15.423840 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:59:15.423796 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-7498df8756-dtbwv"] Apr 21 01:59:15.436604 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:59:15.436575 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7498df8756-dtbwv"] Apr 21 01:59:15.436712 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:59:15.436609 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-6pbq9"] Apr 21 01:59:15.436768 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:59:15.436702 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-7498df8756-dtbwv" Apr 21 01:59:15.438460 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:59:15.438421 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podacc755ef_95d8_4417_b574_be065be2272b.slice/crio-c205ac7b444789ddbdc3f5c7d483b533d77ed62049a19bd18f5225d9a1e692a1 WatchSource:0}: Error finding container c205ac7b444789ddbdc3f5c7d483b533d77ed62049a19bd18f5225d9a1e692a1: Status 404 returned error can't find the container with id c205ac7b444789ddbdc3f5c7d483b533d77ed62049a19bd18f5225d9a1e692a1 Apr 21 01:59:15.439033 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:59:15.439013 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-lzqzq\"" Apr 21 01:59:15.632477 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:59:15.632443 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sslf6\" (UniqueName: \"kubernetes.io/projected/32050bca-f4f1-4659-a1e0-79a9ee9234c4-kube-api-access-sslf6\") pod \"authorino-7498df8756-dtbwv\" (UID: \"32050bca-f4f1-4659-a1e0-79a9ee9234c4\") " pod="kuadrant-system/authorino-7498df8756-dtbwv" Apr 21 01:59:15.733950 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:59:15.733860 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sslf6\" (UniqueName: \"kubernetes.io/projected/32050bca-f4f1-4659-a1e0-79a9ee9234c4-kube-api-access-sslf6\") pod \"authorino-7498df8756-dtbwv\" (UID: \"32050bca-f4f1-4659-a1e0-79a9ee9234c4\") " pod="kuadrant-system/authorino-7498df8756-dtbwv" Apr 21 01:59:15.741839 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:59:15.741798 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sslf6\" (UniqueName: \"kubernetes.io/projected/32050bca-f4f1-4659-a1e0-79a9ee9234c4-kube-api-access-sslf6\") 
pod \"authorino-7498df8756-dtbwv\" (UID: \"32050bca-f4f1-4659-a1e0-79a9ee9234c4\") " pod="kuadrant-system/authorino-7498df8756-dtbwv" Apr 21 01:59:15.754646 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:59:15.754604 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7498df8756-dtbwv" Apr 21 01:59:15.892202 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:59:15.891286 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7498df8756-dtbwv"] Apr 21 01:59:15.893957 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:59:15.893923 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod32050bca_f4f1_4659_a1e0_79a9ee9234c4.slice/crio-1beda11eeb08d57830dffb4aba5cf006116b90cffcbceb6209406a8dd2c742df WatchSource:0}: Error finding container 1beda11eeb08d57830dffb4aba5cf006116b90cffcbceb6209406a8dd2c742df: Status 404 returned error can't find the container with id 1beda11eeb08d57830dffb4aba5cf006116b90cffcbceb6209406a8dd2c742df Apr 21 01:59:16.163724 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:59:16.163683 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-78c99df468-6pbq9" event={"ID":"acc755ef-95d8-4417-b574-be065be2272b","Type":"ContainerStarted","Data":"c205ac7b444789ddbdc3f5c7d483b533d77ed62049a19bd18f5225d9a1e692a1"} Apr 21 01:59:16.164940 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:59:16.164899 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-dtbwv" event={"ID":"32050bca-f4f1-4659-a1e0-79a9ee9234c4","Type":"ContainerStarted","Data":"1beda11eeb08d57830dffb4aba5cf006116b90cffcbceb6209406a8dd2c742df"} Apr 21 01:59:20.186271 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:59:20.186218 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-78c99df468-6pbq9" 
event={"ID":"acc755ef-95d8-4417-b574-be065be2272b","Type":"ContainerStarted","Data":"1788ec7bd97efb00ab5f9011102456efdcc8511870e6681db181251d88d6068a"} Apr 21 01:59:20.186271 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:59:20.186274 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-78c99df468-6pbq9" Apr 21 01:59:20.187457 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:59:20.187434 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-dtbwv" event={"ID":"32050bca-f4f1-4659-a1e0-79a9ee9234c4","Type":"ContainerStarted","Data":"e5547bd1ab38220bb3bb0e05db5a2ee0a77d92217f92f130b87244971afa339b"} Apr 21 01:59:20.202965 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:59:20.202924 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-78c99df468-6pbq9" podStartSLOduration=1.879990851 podStartE2EDuration="6.202909899s" podCreationTimestamp="2026-04-21 01:59:14 +0000 UTC" firstStartedPulling="2026-04-21 01:59:15.440442951 +0000 UTC m=+553.019744990" lastFinishedPulling="2026-04-21 01:59:19.763361998 +0000 UTC m=+557.342664038" observedRunningTime="2026-04-21 01:59:20.200155116 +0000 UTC m=+557.779457423" watchObservedRunningTime="2026-04-21 01:59:20.202909899 +0000 UTC m=+557.782211956" Apr 21 01:59:20.213439 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:59:20.213393 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-7498df8756-dtbwv" podStartSLOduration=1.345468183 podStartE2EDuration="5.213380205s" podCreationTimestamp="2026-04-21 01:59:15 +0000 UTC" firstStartedPulling="2026-04-21 01:59:15.896603631 +0000 UTC m=+553.475905671" lastFinishedPulling="2026-04-21 01:59:19.764515654 +0000 UTC m=+557.343817693" observedRunningTime="2026-04-21 01:59:20.211917422 +0000 UTC m=+557.791219510" watchObservedRunningTime="2026-04-21 01:59:20.213380205 +0000 UTC 
m=+557.792682262" Apr 21 01:59:31.192854 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:59:31.192794 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-78c99df468-6pbq9" Apr 21 01:59:35.203448 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:59:35.203384 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-7d8db5cdb6-rcp5r" podUID="fd81e18c-aea5-40b0-aff0-eec1085bb796" containerName="console" containerID="cri-o://e1e30f6851812fe4c5d0ffea00284a69c3209c3645fa06c1914a02196c44b0df" gracePeriod=15 Apr 21 01:59:35.451429 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:59:35.451407 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7d8db5cdb6-rcp5r_fd81e18c-aea5-40b0-aff0-eec1085bb796/console/0.log" Apr 21 01:59:35.451566 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:59:35.451484 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7d8db5cdb6-rcp5r" Apr 21 01:59:35.495886 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:59:35.495774 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/fd81e18c-aea5-40b0-aff0-eec1085bb796-oauth-serving-cert\") pod \"fd81e18c-aea5-40b0-aff0-eec1085bb796\" (UID: \"fd81e18c-aea5-40b0-aff0-eec1085bb796\") " Apr 21 01:59:35.495886 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:59:35.495826 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/fd81e18c-aea5-40b0-aff0-eec1085bb796-console-serving-cert\") pod \"fd81e18c-aea5-40b0-aff0-eec1085bb796\" (UID: \"fd81e18c-aea5-40b0-aff0-eec1085bb796\") " Apr 21 01:59:35.496091 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:59:35.495893 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-grm4k\" (UniqueName: \"kubernetes.io/projected/fd81e18c-aea5-40b0-aff0-eec1085bb796-kube-api-access-grm4k\") pod \"fd81e18c-aea5-40b0-aff0-eec1085bb796\" (UID: \"fd81e18c-aea5-40b0-aff0-eec1085bb796\") " Apr 21 01:59:35.496091 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:59:35.495917 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fd81e18c-aea5-40b0-aff0-eec1085bb796-service-ca\") pod \"fd81e18c-aea5-40b0-aff0-eec1085bb796\" (UID: \"fd81e18c-aea5-40b0-aff0-eec1085bb796\") " Apr 21 01:59:35.496091 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:59:35.495958 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/fd81e18c-aea5-40b0-aff0-eec1085bb796-console-oauth-config\") pod \"fd81e18c-aea5-40b0-aff0-eec1085bb796\" (UID: \"fd81e18c-aea5-40b0-aff0-eec1085bb796\") " Apr 21 01:59:35.496091 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:59:35.495987 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fd81e18c-aea5-40b0-aff0-eec1085bb796-trusted-ca-bundle\") pod \"fd81e18c-aea5-40b0-aff0-eec1085bb796\" (UID: \"fd81e18c-aea5-40b0-aff0-eec1085bb796\") " Apr 21 01:59:35.496091 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:59:35.496014 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/fd81e18c-aea5-40b0-aff0-eec1085bb796-console-config\") pod \"fd81e18c-aea5-40b0-aff0-eec1085bb796\" (UID: \"fd81e18c-aea5-40b0-aff0-eec1085bb796\") " Apr 21 01:59:35.496361 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:59:35.496326 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fd81e18c-aea5-40b0-aff0-eec1085bb796-oauth-serving-cert" (OuterVolumeSpecName: 
"oauth-serving-cert") pod "fd81e18c-aea5-40b0-aff0-eec1085bb796" (UID: "fd81e18c-aea5-40b0-aff0-eec1085bb796"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 01:59:35.496423 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:59:35.496379 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fd81e18c-aea5-40b0-aff0-eec1085bb796-service-ca" (OuterVolumeSpecName: "service-ca") pod "fd81e18c-aea5-40b0-aff0-eec1085bb796" (UID: "fd81e18c-aea5-40b0-aff0-eec1085bb796"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 01:59:35.496483 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:59:35.496462 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fd81e18c-aea5-40b0-aff0-eec1085bb796-console-config" (OuterVolumeSpecName: "console-config") pod "fd81e18c-aea5-40b0-aff0-eec1085bb796" (UID: "fd81e18c-aea5-40b0-aff0-eec1085bb796"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 01:59:35.496721 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:59:35.496695 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fd81e18c-aea5-40b0-aff0-eec1085bb796-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "fd81e18c-aea5-40b0-aff0-eec1085bb796" (UID: "fd81e18c-aea5-40b0-aff0-eec1085bb796"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 01:59:35.498174 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:59:35.498146 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd81e18c-aea5-40b0-aff0-eec1085bb796-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "fd81e18c-aea5-40b0-aff0-eec1085bb796" (UID: "fd81e18c-aea5-40b0-aff0-eec1085bb796"). 
InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 01:59:35.498271 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:59:35.498189 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd81e18c-aea5-40b0-aff0-eec1085bb796-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "fd81e18c-aea5-40b0-aff0-eec1085bb796" (UID: "fd81e18c-aea5-40b0-aff0-eec1085bb796"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 01:59:35.498271 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:59:35.498248 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd81e18c-aea5-40b0-aff0-eec1085bb796-kube-api-access-grm4k" (OuterVolumeSpecName: "kube-api-access-grm4k") pod "fd81e18c-aea5-40b0-aff0-eec1085bb796" (UID: "fd81e18c-aea5-40b0-aff0-eec1085bb796"). InnerVolumeSpecName "kube-api-access-grm4k". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 01:59:35.597266 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:59:35.597226 2573 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/fd81e18c-aea5-40b0-aff0-eec1085bb796-oauth-serving-cert\") on node \"ip-10-0-129-52.ec2.internal\" DevicePath \"\"" Apr 21 01:59:35.597266 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:59:35.597256 2573 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/fd81e18c-aea5-40b0-aff0-eec1085bb796-console-serving-cert\") on node \"ip-10-0-129-52.ec2.internal\" DevicePath \"\"" Apr 21 01:59:35.597266 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:59:35.597266 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-grm4k\" (UniqueName: \"kubernetes.io/projected/fd81e18c-aea5-40b0-aff0-eec1085bb796-kube-api-access-grm4k\") on node \"ip-10-0-129-52.ec2.internal\" 
DevicePath \"\"" Apr 21 01:59:35.597498 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:59:35.597277 2573 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fd81e18c-aea5-40b0-aff0-eec1085bb796-service-ca\") on node \"ip-10-0-129-52.ec2.internal\" DevicePath \"\"" Apr 21 01:59:35.597498 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:59:35.597286 2573 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/fd81e18c-aea5-40b0-aff0-eec1085bb796-console-oauth-config\") on node \"ip-10-0-129-52.ec2.internal\" DevicePath \"\"" Apr 21 01:59:35.597498 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:59:35.597309 2573 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fd81e18c-aea5-40b0-aff0-eec1085bb796-trusted-ca-bundle\") on node \"ip-10-0-129-52.ec2.internal\" DevicePath \"\"" Apr 21 01:59:35.597498 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:59:35.597320 2573 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/fd81e18c-aea5-40b0-aff0-eec1085bb796-console-config\") on node \"ip-10-0-129-52.ec2.internal\" DevicePath \"\"" Apr 21 01:59:36.250443 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:59:36.250416 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7d8db5cdb6-rcp5r_fd81e18c-aea5-40b0-aff0-eec1085bb796/console/0.log" Apr 21 01:59:36.250878 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:59:36.250460 2573 generic.go:358] "Generic (PLEG): container finished" podID="fd81e18c-aea5-40b0-aff0-eec1085bb796" containerID="e1e30f6851812fe4c5d0ffea00284a69c3209c3645fa06c1914a02196c44b0df" exitCode=2 Apr 21 01:59:36.250878 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:59:36.250499 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7d8db5cdb6-rcp5r" 
event={"ID":"fd81e18c-aea5-40b0-aff0-eec1085bb796","Type":"ContainerDied","Data":"e1e30f6851812fe4c5d0ffea00284a69c3209c3645fa06c1914a02196c44b0df"} Apr 21 01:59:36.250878 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:59:36.250532 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7d8db5cdb6-rcp5r" Apr 21 01:59:36.250878 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:59:36.250543 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7d8db5cdb6-rcp5r" event={"ID":"fd81e18c-aea5-40b0-aff0-eec1085bb796","Type":"ContainerDied","Data":"7464f4ff54dce42e5b2f5ae767013407aaa781aec2c9e28e7c08020ba1da5d3a"} Apr 21 01:59:36.250878 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:59:36.250559 2573 scope.go:117] "RemoveContainer" containerID="e1e30f6851812fe4c5d0ffea00284a69c3209c3645fa06c1914a02196c44b0df" Apr 21 01:59:36.260028 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:59:36.260005 2573 scope.go:117] "RemoveContainer" containerID="e1e30f6851812fe4c5d0ffea00284a69c3209c3645fa06c1914a02196c44b0df" Apr 21 01:59:36.260283 ip-10-0-129-52 kubenswrapper[2573]: E0421 01:59:36.260264 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e1e30f6851812fe4c5d0ffea00284a69c3209c3645fa06c1914a02196c44b0df\": container with ID starting with e1e30f6851812fe4c5d0ffea00284a69c3209c3645fa06c1914a02196c44b0df not found: ID does not exist" containerID="e1e30f6851812fe4c5d0ffea00284a69c3209c3645fa06c1914a02196c44b0df" Apr 21 01:59:36.260343 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:59:36.260296 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1e30f6851812fe4c5d0ffea00284a69c3209c3645fa06c1914a02196c44b0df"} err="failed to get container status \"e1e30f6851812fe4c5d0ffea00284a69c3209c3645fa06c1914a02196c44b0df\": rpc error: code = NotFound desc = could not find container 
\"e1e30f6851812fe4c5d0ffea00284a69c3209c3645fa06c1914a02196c44b0df\": container with ID starting with e1e30f6851812fe4c5d0ffea00284a69c3209c3645fa06c1914a02196c44b0df not found: ID does not exist" Apr 21 01:59:36.273904 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:59:36.273875 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7d8db5cdb6-rcp5r"] Apr 21 01:59:36.277551 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:59:36.277527 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-7d8db5cdb6-rcp5r"] Apr 21 01:59:37.064869 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:59:37.064805 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd81e18c-aea5-40b0-aff0-eec1085bb796" path="/var/lib/kubelet/pods/fd81e18c-aea5-40b0-aff0-eec1085bb796/volumes" Apr 21 01:59:54.135477 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:59:54.135438 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13507ngzb"] Apr 21 01:59:54.135866 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:59:54.135827 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fd81e18c-aea5-40b0-aff0-eec1085bb796" containerName="console" Apr 21 01:59:54.135866 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:59:54.135839 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd81e18c-aea5-40b0-aff0-eec1085bb796" containerName="console" Apr 21 01:59:54.135953 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:59:54.135896 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="fd81e18c-aea5-40b0-aff0-eec1085bb796" containerName="console" Apr 21 01:59:54.146480 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:59:54.146458 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13507ngzb" Apr 21 01:59:54.146480 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:59:54.146470 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13507ngzb"] Apr 21 01:59:54.149254 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:59:54.149231 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-gj698\"" Apr 21 01:59:54.149383 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:59:54.149268 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 21 01:59:54.150238 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:59:54.150221 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 21 01:59:54.275883 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:59:54.275846 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3047e8f4-95ca-4d3a-b8d9-8e8011242bd3-bundle\") pod \"7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13507ngzb\" (UID: \"3047e8f4-95ca-4d3a-b8d9-8e8011242bd3\") " pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13507ngzb" Apr 21 01:59:54.276063 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:59:54.275951 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3047e8f4-95ca-4d3a-b8d9-8e8011242bd3-util\") pod \"7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13507ngzb\" (UID: \"3047e8f4-95ca-4d3a-b8d9-8e8011242bd3\") " pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13507ngzb" Apr 21 01:59:54.276063 
ip-10-0-129-52 kubenswrapper[2573]: I0421 01:59:54.275990 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czjw5\" (UniqueName: \"kubernetes.io/projected/3047e8f4-95ca-4d3a-b8d9-8e8011242bd3-kube-api-access-czjw5\") pod \"7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13507ngzb\" (UID: \"3047e8f4-95ca-4d3a-b8d9-8e8011242bd3\") " pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13507ngzb" Apr 21 01:59:54.376753 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:59:54.376714 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3047e8f4-95ca-4d3a-b8d9-8e8011242bd3-bundle\") pod \"7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13507ngzb\" (UID: \"3047e8f4-95ca-4d3a-b8d9-8e8011242bd3\") " pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13507ngzb" Apr 21 01:59:54.376968 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:59:54.376781 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3047e8f4-95ca-4d3a-b8d9-8e8011242bd3-util\") pod \"7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13507ngzb\" (UID: \"3047e8f4-95ca-4d3a-b8d9-8e8011242bd3\") " pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13507ngzb" Apr 21 01:59:54.376968 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:59:54.376809 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-czjw5\" (UniqueName: \"kubernetes.io/projected/3047e8f4-95ca-4d3a-b8d9-8e8011242bd3-kube-api-access-czjw5\") pod \"7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13507ngzb\" (UID: \"3047e8f4-95ca-4d3a-b8d9-8e8011242bd3\") " pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13507ngzb" Apr 21 01:59:54.377156 ip-10-0-129-52 
kubenswrapper[2573]: I0421 01:59:54.377136 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3047e8f4-95ca-4d3a-b8d9-8e8011242bd3-bundle\") pod \"7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13507ngzb\" (UID: \"3047e8f4-95ca-4d3a-b8d9-8e8011242bd3\") " pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13507ngzb" Apr 21 01:59:54.377207 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:59:54.377165 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3047e8f4-95ca-4d3a-b8d9-8e8011242bd3-util\") pod \"7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13507ngzb\" (UID: \"3047e8f4-95ca-4d3a-b8d9-8e8011242bd3\") " pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13507ngzb" Apr 21 01:59:54.384313 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:59:54.384286 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-czjw5\" (UniqueName: \"kubernetes.io/projected/3047e8f4-95ca-4d3a-b8d9-8e8011242bd3-kube-api-access-czjw5\") pod \"7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13507ngzb\" (UID: \"3047e8f4-95ca-4d3a-b8d9-8e8011242bd3\") " pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13507ngzb" Apr 21 01:59:54.457016 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:59:54.456920 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13507ngzb" Apr 21 01:59:54.583501 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:59:54.583471 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13507ngzb"] Apr 21 01:59:54.584875 ip-10-0-129-52 kubenswrapper[2573]: W0421 01:59:54.584843 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3047e8f4_95ca_4d3a_b8d9_8e8011242bd3.slice/crio-97ba17fbb697b31be6ed6fb5a2be231a82b88c6b7e7c9503d57635b996797e40 WatchSource:0}: Error finding container 97ba17fbb697b31be6ed6fb5a2be231a82b88c6b7e7c9503d57635b996797e40: Status 404 returned error can't find the container with id 97ba17fbb697b31be6ed6fb5a2be231a82b88c6b7e7c9503d57635b996797e40 Apr 21 01:59:55.325540 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:59:55.325504 2573 generic.go:358] "Generic (PLEG): container finished" podID="3047e8f4-95ca-4d3a-b8d9-8e8011242bd3" containerID="30bd7391ef7088b9c7869bad28ea30742d16e83e669f9171e09c3dda6dc72220" exitCode=0 Apr 21 01:59:55.325967 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:59:55.325600 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13507ngzb" event={"ID":"3047e8f4-95ca-4d3a-b8d9-8e8011242bd3","Type":"ContainerDied","Data":"30bd7391ef7088b9c7869bad28ea30742d16e83e669f9171e09c3dda6dc72220"} Apr 21 01:59:55.325967 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:59:55.325643 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13507ngzb" event={"ID":"3047e8f4-95ca-4d3a-b8d9-8e8011242bd3","Type":"ContainerStarted","Data":"97ba17fbb697b31be6ed6fb5a2be231a82b88c6b7e7c9503d57635b996797e40"} Apr 21 01:59:56.331043 ip-10-0-129-52 kubenswrapper[2573]: I0421 
01:59:56.330949 2573 generic.go:358] "Generic (PLEG): container finished" podID="3047e8f4-95ca-4d3a-b8d9-8e8011242bd3" containerID="71e5f71653af83d3bf8de0fd75bbfe78fbe48a8d577c03805e1c2aa28634097a" exitCode=0 Apr 21 01:59:56.331496 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:59:56.331033 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13507ngzb" event={"ID":"3047e8f4-95ca-4d3a-b8d9-8e8011242bd3","Type":"ContainerDied","Data":"71e5f71653af83d3bf8de0fd75bbfe78fbe48a8d577c03805e1c2aa28634097a"} Apr 21 01:59:57.337208 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:59:57.337171 2573 generic.go:358] "Generic (PLEG): container finished" podID="3047e8f4-95ca-4d3a-b8d9-8e8011242bd3" containerID="5e682709424239a88eaa72b34561d6ac3988bd772c0e35938708e0b569dcbbcc" exitCode=0 Apr 21 01:59:57.337582 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:59:57.337226 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13507ngzb" event={"ID":"3047e8f4-95ca-4d3a-b8d9-8e8011242bd3","Type":"ContainerDied","Data":"5e682709424239a88eaa72b34561d6ac3988bd772c0e35938708e0b569dcbbcc"} Apr 21 01:59:58.469730 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:59:58.469704 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13507ngzb" Apr 21 01:59:58.619653 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:59:58.619567 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3047e8f4-95ca-4d3a-b8d9-8e8011242bd3-bundle\") pod \"3047e8f4-95ca-4d3a-b8d9-8e8011242bd3\" (UID: \"3047e8f4-95ca-4d3a-b8d9-8e8011242bd3\") " Apr 21 01:59:58.619840 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:59:58.619661 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-czjw5\" (UniqueName: \"kubernetes.io/projected/3047e8f4-95ca-4d3a-b8d9-8e8011242bd3-kube-api-access-czjw5\") pod \"3047e8f4-95ca-4d3a-b8d9-8e8011242bd3\" (UID: \"3047e8f4-95ca-4d3a-b8d9-8e8011242bd3\") " Apr 21 01:59:58.619840 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:59:58.619693 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3047e8f4-95ca-4d3a-b8d9-8e8011242bd3-util\") pod \"3047e8f4-95ca-4d3a-b8d9-8e8011242bd3\" (UID: \"3047e8f4-95ca-4d3a-b8d9-8e8011242bd3\") " Apr 21 01:59:58.620111 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:59:58.620087 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3047e8f4-95ca-4d3a-b8d9-8e8011242bd3-bundle" (OuterVolumeSpecName: "bundle") pod "3047e8f4-95ca-4d3a-b8d9-8e8011242bd3" (UID: "3047e8f4-95ca-4d3a-b8d9-8e8011242bd3"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 01:59:58.621760 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:59:58.621738 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3047e8f4-95ca-4d3a-b8d9-8e8011242bd3-kube-api-access-czjw5" (OuterVolumeSpecName: "kube-api-access-czjw5") pod "3047e8f4-95ca-4d3a-b8d9-8e8011242bd3" (UID: "3047e8f4-95ca-4d3a-b8d9-8e8011242bd3"). InnerVolumeSpecName "kube-api-access-czjw5". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 01:59:58.628121 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:59:58.628090 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3047e8f4-95ca-4d3a-b8d9-8e8011242bd3-util" (OuterVolumeSpecName: "util") pod "3047e8f4-95ca-4d3a-b8d9-8e8011242bd3" (UID: "3047e8f4-95ca-4d3a-b8d9-8e8011242bd3"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 01:59:58.720426 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:59:58.720391 2573 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3047e8f4-95ca-4d3a-b8d9-8e8011242bd3-bundle\") on node \"ip-10-0-129-52.ec2.internal\" DevicePath \"\"" Apr 21 01:59:58.720426 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:59:58.720421 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-czjw5\" (UniqueName: \"kubernetes.io/projected/3047e8f4-95ca-4d3a-b8d9-8e8011242bd3-kube-api-access-czjw5\") on node \"ip-10-0-129-52.ec2.internal\" DevicePath \"\"" Apr 21 01:59:58.720426 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:59:58.720434 2573 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3047e8f4-95ca-4d3a-b8d9-8e8011242bd3-util\") on node \"ip-10-0-129-52.ec2.internal\" DevicePath \"\"" Apr 21 01:59:59.347538 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:59:59.347461 2573 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13507ngzb" Apr 21 01:59:59.347690 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:59:59.347456 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13507ngzb" event={"ID":"3047e8f4-95ca-4d3a-b8d9-8e8011242bd3","Type":"ContainerDied","Data":"97ba17fbb697b31be6ed6fb5a2be231a82b88c6b7e7c9503d57635b996797e40"} Apr 21 01:59:59.347690 ip-10-0-129-52 kubenswrapper[2573]: I0421 01:59:59.347572 2573 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="97ba17fbb697b31be6ed6fb5a2be231a82b88c6b7e7c9503d57635b996797e40" Apr 21 02:00:02.983387 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:00:02.983355 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f2tdx_ed377958-ce5b-41c7-9512-4b95b799767d/ovn-acl-logging/0.log" Apr 21 02:00:02.984966 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:00:02.984943 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f2tdx_ed377958-ce5b-41c7-9512-4b95b799767d/ovn-acl-logging/0.log" Apr 21 02:00:46.463265 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:00:46.463227 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-8b475cf9f-t7wk6"] Apr 21 02:00:46.463697 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:00:46.463595 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3047e8f4-95ca-4d3a-b8d9-8e8011242bd3" containerName="pull" Apr 21 02:00:46.463697 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:00:46.463606 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="3047e8f4-95ca-4d3a-b8d9-8e8011242bd3" containerName="pull" Apr 21 02:00:46.463697 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:00:46.463614 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing 
container" podUID="3047e8f4-95ca-4d3a-b8d9-8e8011242bd3" containerName="util" Apr 21 02:00:46.463697 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:00:46.463619 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="3047e8f4-95ca-4d3a-b8d9-8e8011242bd3" containerName="util" Apr 21 02:00:46.463697 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:00:46.463629 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3047e8f4-95ca-4d3a-b8d9-8e8011242bd3" containerName="extract" Apr 21 02:00:46.463697 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:00:46.463635 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="3047e8f4-95ca-4d3a-b8d9-8e8011242bd3" containerName="extract" Apr 21 02:00:46.463910 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:00:46.463707 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="3047e8f4-95ca-4d3a-b8d9-8e8011242bd3" containerName="extract" Apr 21 02:00:46.466799 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:00:46.466782 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-t7wk6" Apr 21 02:00:46.472000 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:00:46.471975 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-t7wk6"] Apr 21 02:00:46.553490 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:00:46.553460 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xt56p\" (UniqueName: \"kubernetes.io/projected/c72d24f1-ff60-4d5b-a35b-fcb2b191df2d-kube-api-access-xt56p\") pod \"authorino-8b475cf9f-t7wk6\" (UID: \"c72d24f1-ff60-4d5b-a35b-fcb2b191df2d\") " pod="kuadrant-system/authorino-8b475cf9f-t7wk6" Apr 21 02:00:46.654366 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:00:46.654328 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xt56p\" (UniqueName: \"kubernetes.io/projected/c72d24f1-ff60-4d5b-a35b-fcb2b191df2d-kube-api-access-xt56p\") pod \"authorino-8b475cf9f-t7wk6\" (UID: \"c72d24f1-ff60-4d5b-a35b-fcb2b191df2d\") " pod="kuadrant-system/authorino-8b475cf9f-t7wk6" Apr 21 02:00:46.661766 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:00:46.661733 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xt56p\" (UniqueName: \"kubernetes.io/projected/c72d24f1-ff60-4d5b-a35b-fcb2b191df2d-kube-api-access-xt56p\") pod \"authorino-8b475cf9f-t7wk6\" (UID: \"c72d24f1-ff60-4d5b-a35b-fcb2b191df2d\") " pod="kuadrant-system/authorino-8b475cf9f-t7wk6" Apr 21 02:00:46.700887 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:00:46.700849 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-t7wk6"] Apr 21 02:00:46.701123 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:00:46.701112 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-t7wk6" Apr 21 02:00:46.725172 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:00:46.725097 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-5dfb5487b7-jpcw6"] Apr 21 02:00:46.729797 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:00:46.729750 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-5dfb5487b7-jpcw6" Apr 21 02:00:46.732848 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:00:46.732798 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-5dfb5487b7-jpcw6"] Apr 21 02:00:46.829977 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:00:46.829951 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-t7wk6"] Apr 21 02:00:46.831988 ip-10-0-129-52 kubenswrapper[2573]: W0421 02:00:46.831960 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc72d24f1_ff60_4d5b_a35b_fcb2b191df2d.slice/crio-4cbacad1d9b316f202c045b7f432e2909443ff5a89dd80870ef51957dce8a010 WatchSource:0}: Error finding container 4cbacad1d9b316f202c045b7f432e2909443ff5a89dd80870ef51957dce8a010: Status 404 returned error can't find the container with id 4cbacad1d9b316f202c045b7f432e2909443ff5a89dd80870ef51957dce8a010 Apr 21 02:00:46.833325 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:00:46.833307 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 21 02:00:46.856434 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:00:46.856398 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvqxh\" (UniqueName: \"kubernetes.io/projected/f0c7aafb-c232-48c8-b928-404bda44dfbb-kube-api-access-wvqxh\") pod \"authorino-5dfb5487b7-jpcw6\" (UID: \"f0c7aafb-c232-48c8-b928-404bda44dfbb\") " 
pod="kuadrant-system/authorino-5dfb5487b7-jpcw6" Apr 21 02:00:46.957043 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:00:46.957000 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wvqxh\" (UniqueName: \"kubernetes.io/projected/f0c7aafb-c232-48c8-b928-404bda44dfbb-kube-api-access-wvqxh\") pod \"authorino-5dfb5487b7-jpcw6\" (UID: \"f0c7aafb-c232-48c8-b928-404bda44dfbb\") " pod="kuadrant-system/authorino-5dfb5487b7-jpcw6" Apr 21 02:00:46.964711 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:00:46.964681 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvqxh\" (UniqueName: \"kubernetes.io/projected/f0c7aafb-c232-48c8-b928-404bda44dfbb-kube-api-access-wvqxh\") pod \"authorino-5dfb5487b7-jpcw6\" (UID: \"f0c7aafb-c232-48c8-b928-404bda44dfbb\") " pod="kuadrant-system/authorino-5dfb5487b7-jpcw6" Apr 21 02:00:46.998294 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:00:46.998197 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-5dfb5487b7-jpcw6"] Apr 21 02:00:46.998528 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:00:46.998511 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-5dfb5487b7-jpcw6" Apr 21 02:00:47.123553 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:00:47.123529 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-5dfb5487b7-jpcw6"] Apr 21 02:00:47.124921 ip-10-0-129-52 kubenswrapper[2573]: W0421 02:00:47.124892 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf0c7aafb_c232_48c8_b928_404bda44dfbb.slice/crio-f41015fdd1fb6c5149d37684ed8f2fc97707597c027858d0332a583489a1cf25 WatchSource:0}: Error finding container f41015fdd1fb6c5149d37684ed8f2fc97707597c027858d0332a583489a1cf25: Status 404 returned error can't find the container with id f41015fdd1fb6c5149d37684ed8f2fc97707597c027858d0332a583489a1cf25 Apr 21 02:00:47.548236 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:00:47.548198 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-t7wk6" event={"ID":"c72d24f1-ff60-4d5b-a35b-fcb2b191df2d","Type":"ContainerStarted","Data":"ca66c4ae1670a6e9ce829b78de9735c562d21be43435e73d22bae462dacf9f93"} Apr 21 02:00:47.548236 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:00:47.548235 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-t7wk6" event={"ID":"c72d24f1-ff60-4d5b-a35b-fcb2b191df2d","Type":"ContainerStarted","Data":"4cbacad1d9b316f202c045b7f432e2909443ff5a89dd80870ef51957dce8a010"} Apr 21 02:00:47.548668 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:00:47.548250 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-8b475cf9f-t7wk6" podUID="c72d24f1-ff60-4d5b-a35b-fcb2b191df2d" containerName="authorino" containerID="cri-o://ca66c4ae1670a6e9ce829b78de9735c562d21be43435e73d22bae462dacf9f93" gracePeriod=30 Apr 21 02:00:47.549933 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:00:47.549909 2573 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="kuadrant-system/authorino-5dfb5487b7-jpcw6" event={"ID":"f0c7aafb-c232-48c8-b928-404bda44dfbb","Type":"ContainerStarted","Data":"2a6ac3e2be9e1077438e63c680b2c1b743ed69e005073d645e49b98d0ea584fd"} Apr 21 02:00:47.549933 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:00:47.549941 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-5dfb5487b7-jpcw6" event={"ID":"f0c7aafb-c232-48c8-b928-404bda44dfbb","Type":"ContainerStarted","Data":"f41015fdd1fb6c5149d37684ed8f2fc97707597c027858d0332a583489a1cf25"} Apr 21 02:00:47.550091 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:00:47.549977 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-5dfb5487b7-jpcw6" podUID="f0c7aafb-c232-48c8-b928-404bda44dfbb" containerName="authorino" containerID="cri-o://2a6ac3e2be9e1077438e63c680b2c1b743ed69e005073d645e49b98d0ea584fd" gracePeriod=30 Apr 21 02:00:47.562329 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:00:47.562287 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-8b475cf9f-t7wk6" podStartSLOduration=1.145776933 podStartE2EDuration="1.562271867s" podCreationTimestamp="2026-04-21 02:00:46 +0000 UTC" firstStartedPulling="2026-04-21 02:00:46.833426818 +0000 UTC m=+644.412728853" lastFinishedPulling="2026-04-21 02:00:47.249921747 +0000 UTC m=+644.829223787" observedRunningTime="2026-04-21 02:00:47.561202746 +0000 UTC m=+645.140504804" watchObservedRunningTime="2026-04-21 02:00:47.562271867 +0000 UTC m=+645.141573924" Apr 21 02:00:47.574042 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:00:47.573996 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-5dfb5487b7-jpcw6" podStartSLOduration=1.2194123000000001 podStartE2EDuration="1.573981148s" podCreationTimestamp="2026-04-21 02:00:46 +0000 UTC" firstStartedPulling="2026-04-21 02:00:47.126206819 +0000 UTC m=+644.705508858" 
lastFinishedPulling="2026-04-21 02:00:47.480775656 +0000 UTC m=+645.060077706" observedRunningTime="2026-04-21 02:00:47.57281023 +0000 UTC m=+645.152127256" watchObservedRunningTime="2026-04-21 02:00:47.573981148 +0000 UTC m=+645.153283258" Apr 21 02:00:47.821782 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:00:47.821761 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-5dfb5487b7-jpcw6" Apr 21 02:00:47.825142 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:00:47.825115 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-t7wk6" Apr 21 02:00:47.967325 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:00:47.967283 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xt56p\" (UniqueName: \"kubernetes.io/projected/c72d24f1-ff60-4d5b-a35b-fcb2b191df2d-kube-api-access-xt56p\") pod \"c72d24f1-ff60-4d5b-a35b-fcb2b191df2d\" (UID: \"c72d24f1-ff60-4d5b-a35b-fcb2b191df2d\") " Apr 21 02:00:47.967520 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:00:47.967394 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wvqxh\" (UniqueName: \"kubernetes.io/projected/f0c7aafb-c232-48c8-b928-404bda44dfbb-kube-api-access-wvqxh\") pod \"f0c7aafb-c232-48c8-b928-404bda44dfbb\" (UID: \"f0c7aafb-c232-48c8-b928-404bda44dfbb\") " Apr 21 02:00:47.969402 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:00:47.969377 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0c7aafb-c232-48c8-b928-404bda44dfbb-kube-api-access-wvqxh" (OuterVolumeSpecName: "kube-api-access-wvqxh") pod "f0c7aafb-c232-48c8-b928-404bda44dfbb" (UID: "f0c7aafb-c232-48c8-b928-404bda44dfbb"). InnerVolumeSpecName "kube-api-access-wvqxh". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 02:00:47.969523 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:00:47.969500 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c72d24f1-ff60-4d5b-a35b-fcb2b191df2d-kube-api-access-xt56p" (OuterVolumeSpecName: "kube-api-access-xt56p") pod "c72d24f1-ff60-4d5b-a35b-fcb2b191df2d" (UID: "c72d24f1-ff60-4d5b-a35b-fcb2b191df2d"). InnerVolumeSpecName "kube-api-access-xt56p". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 02:00:48.068565 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:00:48.068472 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xt56p\" (UniqueName: \"kubernetes.io/projected/c72d24f1-ff60-4d5b-a35b-fcb2b191df2d-kube-api-access-xt56p\") on node \"ip-10-0-129-52.ec2.internal\" DevicePath \"\"" Apr 21 02:00:48.068565 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:00:48.068505 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wvqxh\" (UniqueName: \"kubernetes.io/projected/f0c7aafb-c232-48c8-b928-404bda44dfbb-kube-api-access-wvqxh\") on node \"ip-10-0-129-52.ec2.internal\" DevicePath \"\"" Apr 21 02:00:48.554566 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:00:48.554530 2573 generic.go:358] "Generic (PLEG): container finished" podID="f0c7aafb-c232-48c8-b928-404bda44dfbb" containerID="2a6ac3e2be9e1077438e63c680b2c1b743ed69e005073d645e49b98d0ea584fd" exitCode=2 Apr 21 02:00:48.555067 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:00:48.554585 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-5dfb5487b7-jpcw6" Apr 21 02:00:48.555067 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:00:48.554618 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-5dfb5487b7-jpcw6" event={"ID":"f0c7aafb-c232-48c8-b928-404bda44dfbb","Type":"ContainerDied","Data":"2a6ac3e2be9e1077438e63c680b2c1b743ed69e005073d645e49b98d0ea584fd"} Apr 21 02:00:48.555067 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:00:48.554658 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-5dfb5487b7-jpcw6" event={"ID":"f0c7aafb-c232-48c8-b928-404bda44dfbb","Type":"ContainerDied","Data":"f41015fdd1fb6c5149d37684ed8f2fc97707597c027858d0332a583489a1cf25"} Apr 21 02:00:48.555067 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:00:48.554676 2573 scope.go:117] "RemoveContainer" containerID="2a6ac3e2be9e1077438e63c680b2c1b743ed69e005073d645e49b98d0ea584fd" Apr 21 02:00:48.555966 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:00:48.555946 2573 generic.go:358] "Generic (PLEG): container finished" podID="c72d24f1-ff60-4d5b-a35b-fcb2b191df2d" containerID="ca66c4ae1670a6e9ce829b78de9735c562d21be43435e73d22bae462dacf9f93" exitCode=0 Apr 21 02:00:48.556053 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:00:48.555997 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-t7wk6" Apr 21 02:00:48.556053 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:00:48.556027 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-t7wk6" event={"ID":"c72d24f1-ff60-4d5b-a35b-fcb2b191df2d","Type":"ContainerDied","Data":"ca66c4ae1670a6e9ce829b78de9735c562d21be43435e73d22bae462dacf9f93"} Apr 21 02:00:48.556053 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:00:48.556051 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-t7wk6" event={"ID":"c72d24f1-ff60-4d5b-a35b-fcb2b191df2d","Type":"ContainerDied","Data":"4cbacad1d9b316f202c045b7f432e2909443ff5a89dd80870ef51957dce8a010"} Apr 21 02:00:48.570663 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:00:48.570634 2573 scope.go:117] "RemoveContainer" containerID="2a6ac3e2be9e1077438e63c680b2c1b743ed69e005073d645e49b98d0ea584fd" Apr 21 02:00:48.573270 ip-10-0-129-52 kubenswrapper[2573]: E0421 02:00:48.571950 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a6ac3e2be9e1077438e63c680b2c1b743ed69e005073d645e49b98d0ea584fd\": container with ID starting with 2a6ac3e2be9e1077438e63c680b2c1b743ed69e005073d645e49b98d0ea584fd not found: ID does not exist" containerID="2a6ac3e2be9e1077438e63c680b2c1b743ed69e005073d645e49b98d0ea584fd" Apr 21 02:00:48.573270 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:00:48.571989 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a6ac3e2be9e1077438e63c680b2c1b743ed69e005073d645e49b98d0ea584fd"} err="failed to get container status \"2a6ac3e2be9e1077438e63c680b2c1b743ed69e005073d645e49b98d0ea584fd\": rpc error: code = NotFound desc = could not find container \"2a6ac3e2be9e1077438e63c680b2c1b743ed69e005073d645e49b98d0ea584fd\": container with ID starting with 2a6ac3e2be9e1077438e63c680b2c1b743ed69e005073d645e49b98d0ea584fd 
not found: ID does not exist" Apr 21 02:00:48.573270 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:00:48.572018 2573 scope.go:117] "RemoveContainer" containerID="ca66c4ae1670a6e9ce829b78de9735c562d21be43435e73d22bae462dacf9f93" Apr 21 02:00:48.582052 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:00:48.582028 2573 scope.go:117] "RemoveContainer" containerID="ca66c4ae1670a6e9ce829b78de9735c562d21be43435e73d22bae462dacf9f93" Apr 21 02:00:48.582315 ip-10-0-129-52 kubenswrapper[2573]: E0421 02:00:48.582293 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca66c4ae1670a6e9ce829b78de9735c562d21be43435e73d22bae462dacf9f93\": container with ID starting with ca66c4ae1670a6e9ce829b78de9735c562d21be43435e73d22bae462dacf9f93 not found: ID does not exist" containerID="ca66c4ae1670a6e9ce829b78de9735c562d21be43435e73d22bae462dacf9f93" Apr 21 02:00:48.582381 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:00:48.582322 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca66c4ae1670a6e9ce829b78de9735c562d21be43435e73d22bae462dacf9f93"} err="failed to get container status \"ca66c4ae1670a6e9ce829b78de9735c562d21be43435e73d22bae462dacf9f93\": rpc error: code = NotFound desc = could not find container \"ca66c4ae1670a6e9ce829b78de9735c562d21be43435e73d22bae462dacf9f93\": container with ID starting with ca66c4ae1670a6e9ce829b78de9735c562d21be43435e73d22bae462dacf9f93 not found: ID does not exist" Apr 21 02:00:48.587201 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:00:48.587175 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-5dfb5487b7-jpcw6"] Apr 21 02:00:48.592855 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:00:48.592824 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-5dfb5487b7-jpcw6"] Apr 21 02:00:48.601517 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:00:48.601493 2573 kubelet.go:2553] 
"SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-t7wk6"] Apr 21 02:00:48.605233 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:00:48.605212 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-t7wk6"] Apr 21 02:00:48.993015 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:00:48.992979 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7498df8756-dtbwv"] Apr 21 02:00:48.993245 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:00:48.993221 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-7498df8756-dtbwv" podUID="32050bca-f4f1-4659-a1e0-79a9ee9234c4" containerName="authorino" containerID="cri-o://e5547bd1ab38220bb3bb0e05db5a2ee0a77d92217f92f130b87244971afa339b" gracePeriod=30 Apr 21 02:00:49.063867 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:00:49.063834 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c72d24f1-ff60-4d5b-a35b-fcb2b191df2d" path="/var/lib/kubelet/pods/c72d24f1-ff60-4d5b-a35b-fcb2b191df2d/volumes" Apr 21 02:00:49.064176 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:00:49.064162 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0c7aafb-c232-48c8-b928-404bda44dfbb" path="/var/lib/kubelet/pods/f0c7aafb-c232-48c8-b928-404bda44dfbb/volumes" Apr 21 02:00:49.233236 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:00:49.233215 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-7498df8756-dtbwv" Apr 21 02:00:49.381999 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:00:49.381969 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sslf6\" (UniqueName: \"kubernetes.io/projected/32050bca-f4f1-4659-a1e0-79a9ee9234c4-kube-api-access-sslf6\") pod \"32050bca-f4f1-4659-a1e0-79a9ee9234c4\" (UID: \"32050bca-f4f1-4659-a1e0-79a9ee9234c4\") " Apr 21 02:00:49.384055 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:00:49.384031 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32050bca-f4f1-4659-a1e0-79a9ee9234c4-kube-api-access-sslf6" (OuterVolumeSpecName: "kube-api-access-sslf6") pod "32050bca-f4f1-4659-a1e0-79a9ee9234c4" (UID: "32050bca-f4f1-4659-a1e0-79a9ee9234c4"). InnerVolumeSpecName "kube-api-access-sslf6". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 02:00:49.483079 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:00:49.483037 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-sslf6\" (UniqueName: \"kubernetes.io/projected/32050bca-f4f1-4659-a1e0-79a9ee9234c4-kube-api-access-sslf6\") on node \"ip-10-0-129-52.ec2.internal\" DevicePath \"\"" Apr 21 02:00:49.564646 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:00:49.564606 2573 generic.go:358] "Generic (PLEG): container finished" podID="32050bca-f4f1-4659-a1e0-79a9ee9234c4" containerID="e5547bd1ab38220bb3bb0e05db5a2ee0a77d92217f92f130b87244971afa339b" exitCode=0 Apr 21 02:00:49.565071 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:00:49.564659 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-7498df8756-dtbwv" Apr 21 02:00:49.565071 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:00:49.564671 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-dtbwv" event={"ID":"32050bca-f4f1-4659-a1e0-79a9ee9234c4","Type":"ContainerDied","Data":"e5547bd1ab38220bb3bb0e05db5a2ee0a77d92217f92f130b87244971afa339b"} Apr 21 02:00:49.565071 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:00:49.564699 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-dtbwv" event={"ID":"32050bca-f4f1-4659-a1e0-79a9ee9234c4","Type":"ContainerDied","Data":"1beda11eeb08d57830dffb4aba5cf006116b90cffcbceb6209406a8dd2c742df"} Apr 21 02:00:49.565071 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:00:49.564714 2573 scope.go:117] "RemoveContainer" containerID="e5547bd1ab38220bb3bb0e05db5a2ee0a77d92217f92f130b87244971afa339b" Apr 21 02:00:49.574538 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:00:49.574517 2573 scope.go:117] "RemoveContainer" containerID="e5547bd1ab38220bb3bb0e05db5a2ee0a77d92217f92f130b87244971afa339b" Apr 21 02:00:49.574797 ip-10-0-129-52 kubenswrapper[2573]: E0421 02:00:49.574778 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e5547bd1ab38220bb3bb0e05db5a2ee0a77d92217f92f130b87244971afa339b\": container with ID starting with e5547bd1ab38220bb3bb0e05db5a2ee0a77d92217f92f130b87244971afa339b not found: ID does not exist" containerID="e5547bd1ab38220bb3bb0e05db5a2ee0a77d92217f92f130b87244971afa339b" Apr 21 02:00:49.574874 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:00:49.574806 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5547bd1ab38220bb3bb0e05db5a2ee0a77d92217f92f130b87244971afa339b"} err="failed to get container status \"e5547bd1ab38220bb3bb0e05db5a2ee0a77d92217f92f130b87244971afa339b\": rpc error: code = 
NotFound desc = could not find container \"e5547bd1ab38220bb3bb0e05db5a2ee0a77d92217f92f130b87244971afa339b\": container with ID starting with e5547bd1ab38220bb3bb0e05db5a2ee0a77d92217f92f130b87244971afa339b not found: ID does not exist" Apr 21 02:00:49.592162 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:00:49.592126 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7498df8756-dtbwv"] Apr 21 02:00:49.598525 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:00:49.598499 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-7498df8756-dtbwv"] Apr 21 02:00:49.830855 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:00:49.830758 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-bzgt6"] Apr 21 02:00:49.831224 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:00:49.831206 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c72d24f1-ff60-4d5b-a35b-fcb2b191df2d" containerName="authorino" Apr 21 02:00:49.831312 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:00:49.831227 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="c72d24f1-ff60-4d5b-a35b-fcb2b191df2d" containerName="authorino" Apr 21 02:00:49.831312 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:00:49.831247 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="32050bca-f4f1-4659-a1e0-79a9ee9234c4" containerName="authorino" Apr 21 02:00:49.831312 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:00:49.831255 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="32050bca-f4f1-4659-a1e0-79a9ee9234c4" containerName="authorino" Apr 21 02:00:49.831312 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:00:49.831282 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f0c7aafb-c232-48c8-b928-404bda44dfbb" containerName="authorino" Apr 21 02:00:49.831312 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:00:49.831291 2573 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="f0c7aafb-c232-48c8-b928-404bda44dfbb" containerName="authorino" Apr 21 02:00:49.831553 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:00:49.831395 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="32050bca-f4f1-4659-a1e0-79a9ee9234c4" containerName="authorino" Apr 21 02:00:49.831553 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:00:49.831410 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="c72d24f1-ff60-4d5b-a35b-fcb2b191df2d" containerName="authorino" Apr 21 02:00:49.831553 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:00:49.831421 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="f0c7aafb-c232-48c8-b928-404bda44dfbb" containerName="authorino" Apr 21 02:00:49.835895 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:00:49.835873 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-bzgt6" Apr 21 02:00:49.838733 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:00:49.838708 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-controller-dockercfg-s2mck\"" Apr 21 02:00:49.842084 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:00:49.842056 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-bzgt6"] Apr 21 02:00:49.987518 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:00:49.987485 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhgww\" (UniqueName: \"kubernetes.io/projected/dcc3febf-2280-47e0-8c26-cf41acbfb209-kube-api-access-vhgww\") pod \"maas-controller-6d4c8f55f9-bzgt6\" (UID: \"dcc3febf-2280-47e0-8c26-cf41acbfb209\") " pod="opendatahub/maas-controller-6d4c8f55f9-bzgt6" Apr 21 02:00:50.088680 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:00:50.088591 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vhgww\" 
(UniqueName: \"kubernetes.io/projected/dcc3febf-2280-47e0-8c26-cf41acbfb209-kube-api-access-vhgww\") pod \"maas-controller-6d4c8f55f9-bzgt6\" (UID: \"dcc3febf-2280-47e0-8c26-cf41acbfb209\") " pod="opendatahub/maas-controller-6d4c8f55f9-bzgt6" Apr 21 02:00:50.096496 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:00:50.096464 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhgww\" (UniqueName: \"kubernetes.io/projected/dcc3febf-2280-47e0-8c26-cf41acbfb209-kube-api-access-vhgww\") pod \"maas-controller-6d4c8f55f9-bzgt6\" (UID: \"dcc3febf-2280-47e0-8c26-cf41acbfb209\") " pod="opendatahub/maas-controller-6d4c8f55f9-bzgt6" Apr 21 02:00:50.147291 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:00:50.147258 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-bzgt6" Apr 21 02:00:50.271525 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:00:50.271496 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-bzgt6"] Apr 21 02:00:50.273035 ip-10-0-129-52 kubenswrapper[2573]: W0421 02:00:50.273005 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddcc3febf_2280_47e0_8c26_cf41acbfb209.slice/crio-6b132a7fced9c288011b7a0021629cc6c9baaa4f6878e7c2192da1a4b4163d96 WatchSource:0}: Error finding container 6b132a7fced9c288011b7a0021629cc6c9baaa4f6878e7c2192da1a4b4163d96: Status 404 returned error can't find the container with id 6b132a7fced9c288011b7a0021629cc6c9baaa4f6878e7c2192da1a4b4163d96 Apr 21 02:00:50.570776 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:00:50.570738 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-bzgt6" event={"ID":"dcc3febf-2280-47e0-8c26-cf41acbfb209","Type":"ContainerStarted","Data":"6b132a7fced9c288011b7a0021629cc6c9baaa4f6878e7c2192da1a4b4163d96"} Apr 21 02:00:51.067727 ip-10-0-129-52 
kubenswrapper[2573]: I0421 02:00:51.067691 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32050bca-f4f1-4659-a1e0-79a9ee9234c4" path="/var/lib/kubelet/pods/32050bca-f4f1-4659-a1e0-79a9ee9234c4/volumes" Apr 21 02:00:52.582237 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:00:52.582182 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-bzgt6" event={"ID":"dcc3febf-2280-47e0-8c26-cf41acbfb209","Type":"ContainerStarted","Data":"61dc624f7b792887a44af2b3e6ea6f5d708e434eff05d72393168c1cbfe2dc3a"} Apr 21 02:00:52.582775 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:00:52.582265 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-6d4c8f55f9-bzgt6" Apr 21 02:00:52.606688 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:00:52.606625 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-6d4c8f55f9-bzgt6" podStartSLOduration=1.457821782 podStartE2EDuration="3.606603078s" podCreationTimestamp="2026-04-21 02:00:49 +0000 UTC" firstStartedPulling="2026-04-21 02:00:50.274376537 +0000 UTC m=+647.853678572" lastFinishedPulling="2026-04-21 02:00:52.423157825 +0000 UTC m=+650.002459868" observedRunningTime="2026-04-21 02:00:52.60309003 +0000 UTC m=+650.182392089" watchObservedRunningTime="2026-04-21 02:00:52.606603078 +0000 UTC m=+650.185905133" Apr 21 02:00:55.407659 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:00:55.407621 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-6pbq9"] Apr 21 02:01:03.598084 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:01:03.598051 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-6d4c8f55f9-bzgt6" Apr 21 02:01:05.016766 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:01:05.016733 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["opendatahub/maas-controller-6d4c8f55f9-bzgt6"] Apr 21 02:01:05.017176 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:01:05.016959 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-6d4c8f55f9-bzgt6" podUID="dcc3febf-2280-47e0-8c26-cf41acbfb209" containerName="manager" containerID="cri-o://61dc624f7b792887a44af2b3e6ea6f5d708e434eff05d72393168c1cbfe2dc3a" gracePeriod=10 Apr 21 02:01:05.278748 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:01:05.278668 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-bzgt6" Apr 21 02:01:05.340842 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:01:05.340777 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vhgww\" (UniqueName: \"kubernetes.io/projected/dcc3febf-2280-47e0-8c26-cf41acbfb209-kube-api-access-vhgww\") pod \"dcc3febf-2280-47e0-8c26-cf41acbfb209\" (UID: \"dcc3febf-2280-47e0-8c26-cf41acbfb209\") " Apr 21 02:01:05.343017 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:01:05.342979 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dcc3febf-2280-47e0-8c26-cf41acbfb209-kube-api-access-vhgww" (OuterVolumeSpecName: "kube-api-access-vhgww") pod "dcc3febf-2280-47e0-8c26-cf41acbfb209" (UID: "dcc3febf-2280-47e0-8c26-cf41acbfb209"). InnerVolumeSpecName "kube-api-access-vhgww". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 02:01:05.442139 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:01:05.442088 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vhgww\" (UniqueName: \"kubernetes.io/projected/dcc3febf-2280-47e0-8c26-cf41acbfb209-kube-api-access-vhgww\") on node \"ip-10-0-129-52.ec2.internal\" DevicePath \"\"" Apr 21 02:01:05.643619 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:01:05.643584 2573 generic.go:358] "Generic (PLEG): container finished" podID="dcc3febf-2280-47e0-8c26-cf41acbfb209" containerID="61dc624f7b792887a44af2b3e6ea6f5d708e434eff05d72393168c1cbfe2dc3a" exitCode=0 Apr 21 02:01:05.643783 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:01:05.643636 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-bzgt6" event={"ID":"dcc3febf-2280-47e0-8c26-cf41acbfb209","Type":"ContainerDied","Data":"61dc624f7b792887a44af2b3e6ea6f5d708e434eff05d72393168c1cbfe2dc3a"} Apr 21 02:01:05.643783 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:01:05.643648 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-bzgt6" Apr 21 02:01:05.643783 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:01:05.643660 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-bzgt6" event={"ID":"dcc3febf-2280-47e0-8c26-cf41acbfb209","Type":"ContainerDied","Data":"6b132a7fced9c288011b7a0021629cc6c9baaa4f6878e7c2192da1a4b4163d96"} Apr 21 02:01:05.643783 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:01:05.643676 2573 scope.go:117] "RemoveContainer" containerID="61dc624f7b792887a44af2b3e6ea6f5d708e434eff05d72393168c1cbfe2dc3a" Apr 21 02:01:05.654499 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:01:05.654476 2573 scope.go:117] "RemoveContainer" containerID="61dc624f7b792887a44af2b3e6ea6f5d708e434eff05d72393168c1cbfe2dc3a" Apr 21 02:01:05.654787 ip-10-0-129-52 kubenswrapper[2573]: E0421 02:01:05.654768 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"61dc624f7b792887a44af2b3e6ea6f5d708e434eff05d72393168c1cbfe2dc3a\": container with ID starting with 61dc624f7b792887a44af2b3e6ea6f5d708e434eff05d72393168c1cbfe2dc3a not found: ID does not exist" containerID="61dc624f7b792887a44af2b3e6ea6f5d708e434eff05d72393168c1cbfe2dc3a" Apr 21 02:01:05.654883 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:01:05.654801 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61dc624f7b792887a44af2b3e6ea6f5d708e434eff05d72393168c1cbfe2dc3a"} err="failed to get container status \"61dc624f7b792887a44af2b3e6ea6f5d708e434eff05d72393168c1cbfe2dc3a\": rpc error: code = NotFound desc = could not find container \"61dc624f7b792887a44af2b3e6ea6f5d708e434eff05d72393168c1cbfe2dc3a\": container with ID starting with 61dc624f7b792887a44af2b3e6ea6f5d708e434eff05d72393168c1cbfe2dc3a not found: ID does not exist" Apr 21 02:01:05.667642 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:01:05.667610 2573 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-bzgt6"] Apr 21 02:01:05.669897 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:01:05.669874 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-bzgt6"] Apr 21 02:01:07.064374 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:01:07.064337 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dcc3febf-2280-47e0-8c26-cf41acbfb209" path="/var/lib/kubelet/pods/dcc3febf-2280-47e0-8c26-cf41acbfb209/volumes" Apr 21 02:01:25.717335 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:01:25.717299 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-55874489bb-5l46x"] Apr 21 02:01:25.717974 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:01:25.717846 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dcc3febf-2280-47e0-8c26-cf41acbfb209" containerName="manager" Apr 21 02:01:25.717974 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:01:25.717866 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcc3febf-2280-47e0-8c26-cf41acbfb209" containerName="manager" Apr 21 02:01:25.718115 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:01:25.717991 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="dcc3febf-2280-47e0-8c26-cf41acbfb209" containerName="manager" Apr 21 02:01:25.725508 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:01:25.725483 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-55874489bb-5l46x" Apr 21 02:01:25.729594 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:01:25.729570 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"maas-parameters\"" Apr 21 02:01:25.729712 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:01:25.729628 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-serving-cert\"" Apr 21 02:01:25.729712 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:01:25.729569 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-fjsp7\"" Apr 21 02:01:25.737978 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:01:25.737952 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-55874489bb-5l46x"] Apr 21 02:01:25.832432 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:01:25.832397 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/1c20f62f-3957-4098-8b95-fd58e30ca1cd-maas-api-tls\") pod \"maas-api-55874489bb-5l46x\" (UID: \"1c20f62f-3957-4098-8b95-fd58e30ca1cd\") " pod="opendatahub/maas-api-55874489bb-5l46x" Apr 21 02:01:25.832633 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:01:25.832437 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbtbk\" (UniqueName: \"kubernetes.io/projected/1c20f62f-3957-4098-8b95-fd58e30ca1cd-kube-api-access-lbtbk\") pod \"maas-api-55874489bb-5l46x\" (UID: \"1c20f62f-3957-4098-8b95-fd58e30ca1cd\") " pod="opendatahub/maas-api-55874489bb-5l46x" Apr 21 02:01:25.933509 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:01:25.933472 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/1c20f62f-3957-4098-8b95-fd58e30ca1cd-maas-api-tls\") pod 
\"maas-api-55874489bb-5l46x\" (UID: \"1c20f62f-3957-4098-8b95-fd58e30ca1cd\") " pod="opendatahub/maas-api-55874489bb-5l46x" Apr 21 02:01:25.933509 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:01:25.933510 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lbtbk\" (UniqueName: \"kubernetes.io/projected/1c20f62f-3957-4098-8b95-fd58e30ca1cd-kube-api-access-lbtbk\") pod \"maas-api-55874489bb-5l46x\" (UID: \"1c20f62f-3957-4098-8b95-fd58e30ca1cd\") " pod="opendatahub/maas-api-55874489bb-5l46x" Apr 21 02:01:25.936067 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:01:25.936043 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/1c20f62f-3957-4098-8b95-fd58e30ca1cd-maas-api-tls\") pod \"maas-api-55874489bb-5l46x\" (UID: \"1c20f62f-3957-4098-8b95-fd58e30ca1cd\") " pod="opendatahub/maas-api-55874489bb-5l46x" Apr 21 02:01:25.941090 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:01:25.941063 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbtbk\" (UniqueName: \"kubernetes.io/projected/1c20f62f-3957-4098-8b95-fd58e30ca1cd-kube-api-access-lbtbk\") pod \"maas-api-55874489bb-5l46x\" (UID: \"1c20f62f-3957-4098-8b95-fd58e30ca1cd\") " pod="opendatahub/maas-api-55874489bb-5l46x" Apr 21 02:01:26.037310 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:01:26.037218 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-55874489bb-5l46x" Apr 21 02:01:26.170362 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:01:26.170272 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-55874489bb-5l46x"] Apr 21 02:01:26.725134 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:01:26.725101 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-55874489bb-5l46x" event={"ID":"1c20f62f-3957-4098-8b95-fd58e30ca1cd","Type":"ContainerStarted","Data":"d99e20d62886adb68fb84f9f1f95face659f244572db1d5a3493b8fbe64f7a3f"} Apr 21 02:01:28.734340 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:01:28.734306 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-55874489bb-5l46x" event={"ID":"1c20f62f-3957-4098-8b95-fd58e30ca1cd","Type":"ContainerStarted","Data":"8e2b0cf6fd9dca75da2d86307824977317b3c7cce705aa31547140683df46636"} Apr 21 02:01:28.734733 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:01:28.734368 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-api-55874489bb-5l46x" Apr 21 02:01:28.752583 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:01:28.752538 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-55874489bb-5l46x" podStartSLOduration=2.106080838 podStartE2EDuration="3.752524945s" podCreationTimestamp="2026-04-21 02:01:25 +0000 UTC" firstStartedPulling="2026-04-21 02:01:26.169923716 +0000 UTC m=+683.749225759" lastFinishedPulling="2026-04-21 02:01:27.816367827 +0000 UTC m=+685.395669866" observedRunningTime="2026-04-21 02:01:28.749505617 +0000 UTC m=+686.328807676" watchObservedRunningTime="2026-04-21 02:01:28.752524945 +0000 UTC m=+686.331827003" Apr 21 02:01:34.743547 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:01:34.743518 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-api-55874489bb-5l46x" Apr 21 02:01:57.252411 ip-10-0-129-52 
kubenswrapper[2573]: I0421 02:01:57.252325 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-85ddc86fdf-82kxh"] Apr 21 02:01:57.255798 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:01:57.255780 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-85ddc86fdf-82kxh" Apr 21 02:01:57.258697 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:01:57.258505 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"authorino-oidc-ca\"" Apr 21 02:01:57.259448 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:01:57.259423 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-server-cert\"" Apr 21 02:01:57.259590 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:01:57.259423 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-lzqzq\"" Apr 21 02:01:57.262365 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:01:57.262341 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-85ddc86fdf-82kxh"] Apr 21 02:01:57.306399 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:01:57.306366 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oidc-ca\" (UniqueName: \"kubernetes.io/configmap/1875c8ac-649c-4035-bb58-6041c3d998ea-oidc-ca\") pod \"authorino-85ddc86fdf-82kxh\" (UID: \"1875c8ac-649c-4035-bb58-6041c3d998ea\") " pod="kuadrant-system/authorino-85ddc86fdf-82kxh" Apr 21 02:01:57.306600 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:01:57.306444 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5vw8\" (UniqueName: \"kubernetes.io/projected/1875c8ac-649c-4035-bb58-6041c3d998ea-kube-api-access-s5vw8\") pod \"authorino-85ddc86fdf-82kxh\" (UID: \"1875c8ac-649c-4035-bb58-6041c3d998ea\") " 
pod="kuadrant-system/authorino-85ddc86fdf-82kxh" Apr 21 02:01:57.306600 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:01:57.306495 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/1875c8ac-649c-4035-bb58-6041c3d998ea-tls-cert\") pod \"authorino-85ddc86fdf-82kxh\" (UID: \"1875c8ac-649c-4035-bb58-6041c3d998ea\") " pod="kuadrant-system/authorino-85ddc86fdf-82kxh" Apr 21 02:01:57.407406 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:01:57.407361 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/1875c8ac-649c-4035-bb58-6041c3d998ea-tls-cert\") pod \"authorino-85ddc86fdf-82kxh\" (UID: \"1875c8ac-649c-4035-bb58-6041c3d998ea\") " pod="kuadrant-system/authorino-85ddc86fdf-82kxh" Apr 21 02:01:57.407603 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:01:57.407420 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oidc-ca\" (UniqueName: \"kubernetes.io/configmap/1875c8ac-649c-4035-bb58-6041c3d998ea-oidc-ca\") pod \"authorino-85ddc86fdf-82kxh\" (UID: \"1875c8ac-649c-4035-bb58-6041c3d998ea\") " pod="kuadrant-system/authorino-85ddc86fdf-82kxh" Apr 21 02:01:57.407603 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:01:57.407480 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s5vw8\" (UniqueName: \"kubernetes.io/projected/1875c8ac-649c-4035-bb58-6041c3d998ea-kube-api-access-s5vw8\") pod \"authorino-85ddc86fdf-82kxh\" (UID: \"1875c8ac-649c-4035-bb58-6041c3d998ea\") " pod="kuadrant-system/authorino-85ddc86fdf-82kxh" Apr 21 02:01:57.408208 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:01:57.408186 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oidc-ca\" (UniqueName: \"kubernetes.io/configmap/1875c8ac-649c-4035-bb58-6041c3d998ea-oidc-ca\") pod \"authorino-85ddc86fdf-82kxh\" (UID: 
\"1875c8ac-649c-4035-bb58-6041c3d998ea\") " pod="kuadrant-system/authorino-85ddc86fdf-82kxh" Apr 21 02:01:57.410015 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:01:57.409993 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/1875c8ac-649c-4035-bb58-6041c3d998ea-tls-cert\") pod \"authorino-85ddc86fdf-82kxh\" (UID: \"1875c8ac-649c-4035-bb58-6041c3d998ea\") " pod="kuadrant-system/authorino-85ddc86fdf-82kxh" Apr 21 02:01:57.417527 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:01:57.417504 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5vw8\" (UniqueName: \"kubernetes.io/projected/1875c8ac-649c-4035-bb58-6041c3d998ea-kube-api-access-s5vw8\") pod \"authorino-85ddc86fdf-82kxh\" (UID: \"1875c8ac-649c-4035-bb58-6041c3d998ea\") " pod="kuadrant-system/authorino-85ddc86fdf-82kxh" Apr 21 02:01:57.566060 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:01:57.565965 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-85ddc86fdf-82kxh" Apr 21 02:01:57.695586 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:01:57.695557 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-85ddc86fdf-82kxh"] Apr 21 02:01:57.697181 ip-10-0-129-52 kubenswrapper[2573]: W0421 02:01:57.697150 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1875c8ac_649c_4035_bb58_6041c3d998ea.slice/crio-feeaf76789fa3f856877a74ca3dc33cac4349936f9527fb87ad864f1802dd558 WatchSource:0}: Error finding container feeaf76789fa3f856877a74ca3dc33cac4349936f9527fb87ad864f1802dd558: Status 404 returned error can't find the container with id feeaf76789fa3f856877a74ca3dc33cac4349936f9527fb87ad864f1802dd558 Apr 21 02:01:57.853197 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:01:57.853106 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-85ddc86fdf-82kxh" event={"ID":"1875c8ac-649c-4035-bb58-6041c3d998ea","Type":"ContainerStarted","Data":"feeaf76789fa3f856877a74ca3dc33cac4349936f9527fb87ad864f1802dd558"} Apr 21 02:01:58.859582 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:01:58.859546 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-85ddc86fdf-82kxh" event={"ID":"1875c8ac-649c-4035-bb58-6041c3d998ea","Type":"ContainerStarted","Data":"cbd9ea2f478e8a2b1f6135606fc39fb2bb42a8db94526a1595e020305f646374"} Apr 21 02:01:58.874380 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:01:58.874332 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-85ddc86fdf-82kxh" podStartSLOduration=1.269843863 podStartE2EDuration="1.874318252s" podCreationTimestamp="2026-04-21 02:01:57 +0000 UTC" firstStartedPulling="2026-04-21 02:01:57.698516975 +0000 UTC m=+715.277819010" lastFinishedPulling="2026-04-21 02:01:58.302991363 +0000 UTC m=+715.882293399" 
observedRunningTime="2026-04-21 02:01:58.872723938 +0000 UTC m=+716.452026020" watchObservedRunningTime="2026-04-21 02:01:58.874318252 +0000 UTC m=+716.453620310" Apr 21 02:02:07.809828 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:02:07.809783 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-6pbq9"] Apr 21 02:02:17.106135 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:02:17.106098 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-6pbq9"] Apr 21 02:02:22.277756 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:02:22.277709 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-6pbq9"] Apr 21 02:02:34.568417 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:02:34.568380 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-6pbq9"] Apr 21 02:02:39.666810 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:02:39.666768 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-6pbq9"] Apr 21 02:02:51.667195 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:02:51.667158 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-6pbq9"] Apr 21 02:03:29.657350 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:03:29.657315 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-668775b6cb-nh4bv"] Apr 21 02:03:29.660793 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:03:29.660775 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-668775b6cb-nh4bv" Apr 21 02:03:29.666663 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:03:29.666636 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-668775b6cb-nh4bv"] Apr 21 02:03:29.708807 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:03:29.708762 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oidc-ca\" (UniqueName: \"kubernetes.io/configmap/2895a5c8-73cb-4786-9e40-b0d81dc476ba-oidc-ca\") pod \"authorino-668775b6cb-nh4bv\" (UID: \"2895a5c8-73cb-4786-9e40-b0d81dc476ba\") " pod="kuadrant-system/authorino-668775b6cb-nh4bv" Apr 21 02:03:29.709026 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:03:29.708870 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmfbh\" (UniqueName: \"kubernetes.io/projected/2895a5c8-73cb-4786-9e40-b0d81dc476ba-kube-api-access-qmfbh\") pod \"authorino-668775b6cb-nh4bv\" (UID: \"2895a5c8-73cb-4786-9e40-b0d81dc476ba\") " pod="kuadrant-system/authorino-668775b6cb-nh4bv" Apr 21 02:03:29.709026 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:03:29.708987 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/2895a5c8-73cb-4786-9e40-b0d81dc476ba-tls-cert\") pod \"authorino-668775b6cb-nh4bv\" (UID: \"2895a5c8-73cb-4786-9e40-b0d81dc476ba\") " pod="kuadrant-system/authorino-668775b6cb-nh4bv" Apr 21 02:03:29.810212 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:03:29.810171 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/2895a5c8-73cb-4786-9e40-b0d81dc476ba-tls-cert\") pod \"authorino-668775b6cb-nh4bv\" (UID: \"2895a5c8-73cb-4786-9e40-b0d81dc476ba\") " pod="kuadrant-system/authorino-668775b6cb-nh4bv" Apr 21 02:03:29.810384 ip-10-0-129-52 kubenswrapper[2573]: I0421 
02:03:29.810223 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oidc-ca\" (UniqueName: \"kubernetes.io/configmap/2895a5c8-73cb-4786-9e40-b0d81dc476ba-oidc-ca\") pod \"authorino-668775b6cb-nh4bv\" (UID: \"2895a5c8-73cb-4786-9e40-b0d81dc476ba\") " pod="kuadrant-system/authorino-668775b6cb-nh4bv" Apr 21 02:03:29.810384 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:03:29.810262 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qmfbh\" (UniqueName: \"kubernetes.io/projected/2895a5c8-73cb-4786-9e40-b0d81dc476ba-kube-api-access-qmfbh\") pod \"authorino-668775b6cb-nh4bv\" (UID: \"2895a5c8-73cb-4786-9e40-b0d81dc476ba\") " pod="kuadrant-system/authorino-668775b6cb-nh4bv" Apr 21 02:03:29.810968 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:03:29.810944 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oidc-ca\" (UniqueName: \"kubernetes.io/configmap/2895a5c8-73cb-4786-9e40-b0d81dc476ba-oidc-ca\") pod \"authorino-668775b6cb-nh4bv\" (UID: \"2895a5c8-73cb-4786-9e40-b0d81dc476ba\") " pod="kuadrant-system/authorino-668775b6cb-nh4bv" Apr 21 02:03:29.812774 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:03:29.812747 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/2895a5c8-73cb-4786-9e40-b0d81dc476ba-tls-cert\") pod \"authorino-668775b6cb-nh4bv\" (UID: \"2895a5c8-73cb-4786-9e40-b0d81dc476ba\") " pod="kuadrant-system/authorino-668775b6cb-nh4bv" Apr 21 02:03:29.817724 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:03:29.817695 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qmfbh\" (UniqueName: \"kubernetes.io/projected/2895a5c8-73cb-4786-9e40-b0d81dc476ba-kube-api-access-qmfbh\") pod \"authorino-668775b6cb-nh4bv\" (UID: \"2895a5c8-73cb-4786-9e40-b0d81dc476ba\") " pod="kuadrant-system/authorino-668775b6cb-nh4bv" Apr 21 02:03:29.971477 ip-10-0-129-52 kubenswrapper[2573]: 
I0421 02:03:29.971380 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-668775b6cb-nh4bv" Apr 21 02:03:30.093196 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:03:30.093171 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-668775b6cb-nh4bv"] Apr 21 02:03:30.094912 ip-10-0-129-52 kubenswrapper[2573]: W0421 02:03:30.094885 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2895a5c8_73cb_4786_9e40_b0d81dc476ba.slice/crio-5cf4781db5c24fbf1d6e06071d144596140389e7b622acbd5a114672958f78db WatchSource:0}: Error finding container 5cf4781db5c24fbf1d6e06071d144596140389e7b622acbd5a114672958f78db: Status 404 returned error can't find the container with id 5cf4781db5c24fbf1d6e06071d144596140389e7b622acbd5a114672958f78db Apr 21 02:03:30.234906 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:03:30.234874 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-668775b6cb-nh4bv" event={"ID":"2895a5c8-73cb-4786-9e40-b0d81dc476ba","Type":"ContainerStarted","Data":"5cf4781db5c24fbf1d6e06071d144596140389e7b622acbd5a114672958f78db"} Apr 21 02:03:30.386878 ip-10-0-129-52 kubenswrapper[2573]: E0421 02:03:30.386801 2573 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/kuadrant/authorino:v0.24.0: reading manifest v0.24.0 in quay.io/kuadrant/authorino: received unexpected HTTP status: 502 Bad Gateway; artifact err: provided artifact is a container image" image="quay.io/kuadrant/authorino:v0.24.0" Apr 21 02:03:30.387062 ip-10-0-129-52 kubenswrapper[2573]: E0421 02:03:30.387028 2573 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:authorino,Image:quay.io/kuadrant/authorino:v0.24.0,Command:[],Args:[--allow-superseding-host-subsets 
--log-level=debug --tls-cert=/etc/ssl/certs/tls.crt --tls-cert-key=/etc/ssl/private/tls.key],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:SSL_CERT_FILE,Value:/etc/ssl/certs/openshift-service-ca/service-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:REQUESTS_CA_BUNDLE,Value:/etc/ssl/certs/openshift-service-ca/service-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tls-cert,ReadOnly:true,MountPath:/etc/ssl/certs/tls.crt,SubPath:tls.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tls-cert,ReadOnly:true,MountPath:/etc/ssl/private/tls.key,SubPath:tls.key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:oidc-ca,ReadOnly:true,MountPath:/etc/ssl/certs/oidc-ca.crt,SubPath:ca.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qmfbh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000680000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod authorino-668775b6cb-nh4bv_kuadrant-system(2895a5c8-73cb-4786-9e40-b0d81dc476ba): ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/kuadrant/authorino:v0.24.0: reading manifest v0.24.0 in 
quay.io/kuadrant/authorino: received unexpected HTTP status: 502 Bad Gateway; artifact err: provided artifact is a container image" logger="UnhandledError" Apr 21 02:03:30.388204 ip-10-0-129-52 kubenswrapper[2573]: E0421 02:03:30.388173 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"authorino\" with ErrImagePull: \"unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/kuadrant/authorino:v0.24.0: reading manifest v0.24.0 in quay.io/kuadrant/authorino: received unexpected HTTP status: 502 Bad Gateway; artifact err: provided artifact is a container image\"" pod="kuadrant-system/authorino-668775b6cb-nh4bv" podUID="2895a5c8-73cb-4786-9e40-b0d81dc476ba" Apr 21 02:03:31.239882 ip-10-0-129-52 kubenswrapper[2573]: E0421 02:03:31.239841 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"authorino\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/kuadrant/authorino:v0.24.0\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/kuadrant/authorino:v0.24.0: reading manifest v0.24.0 in quay.io/kuadrant/authorino: received unexpected HTTP status: 502 Bad Gateway; artifact err: provided artifact is a container image\"" pod="kuadrant-system/authorino-668775b6cb-nh4bv" podUID="2895a5c8-73cb-4786-9e40-b0d81dc476ba" Apr 21 02:03:44.295384 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:03:44.295343 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-668775b6cb-nh4bv" event={"ID":"2895a5c8-73cb-4786-9e40-b0d81dc476ba","Type":"ContainerStarted","Data":"9be77bb79c1479257d18a8702ed46ca9ca1c6baf2e79cb8a5fa9e9bf2df96a3b"} Apr 21 02:03:44.312501 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:03:44.312448 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-668775b6cb-nh4bv" podStartSLOduration=2.025953766 
podStartE2EDuration="15.31243341s" podCreationTimestamp="2026-04-21 02:03:29 +0000 UTC" firstStartedPulling="2026-04-21 02:03:30.09657516 +0000 UTC m=+807.675877196" lastFinishedPulling="2026-04-21 02:03:43.383054795 +0000 UTC m=+820.962356840" observedRunningTime="2026-04-21 02:03:44.309192552 +0000 UTC m=+821.888494610" watchObservedRunningTime="2026-04-21 02:03:44.31243341 +0000 UTC m=+821.891735485" Apr 21 02:03:44.336650 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:03:44.336609 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-85ddc86fdf-82kxh"] Apr 21 02:03:44.336873 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:03:44.336839 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-85ddc86fdf-82kxh" podUID="1875c8ac-649c-4035-bb58-6041c3d998ea" containerName="authorino" containerID="cri-o://cbd9ea2f478e8a2b1f6135606fc39fb2bb42a8db94526a1595e020305f646374" gracePeriod=30 Apr 21 02:03:44.576872 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:03:44.576809 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-85ddc86fdf-82kxh" Apr 21 02:03:44.641907 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:03:44.641866 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s5vw8\" (UniqueName: \"kubernetes.io/projected/1875c8ac-649c-4035-bb58-6041c3d998ea-kube-api-access-s5vw8\") pod \"1875c8ac-649c-4035-bb58-6041c3d998ea\" (UID: \"1875c8ac-649c-4035-bb58-6041c3d998ea\") " Apr 21 02:03:44.641907 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:03:44.641920 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/1875c8ac-649c-4035-bb58-6041c3d998ea-tls-cert\") pod \"1875c8ac-649c-4035-bb58-6041c3d998ea\" (UID: \"1875c8ac-649c-4035-bb58-6041c3d998ea\") " Apr 21 02:03:44.642161 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:03:44.641967 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oidc-ca\" (UniqueName: \"kubernetes.io/configmap/1875c8ac-649c-4035-bb58-6041c3d998ea-oidc-ca\") pod \"1875c8ac-649c-4035-bb58-6041c3d998ea\" (UID: \"1875c8ac-649c-4035-bb58-6041c3d998ea\") " Apr 21 02:03:44.644112 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:03:44.644081 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1875c8ac-649c-4035-bb58-6041c3d998ea-kube-api-access-s5vw8" (OuterVolumeSpecName: "kube-api-access-s5vw8") pod "1875c8ac-649c-4035-bb58-6041c3d998ea" (UID: "1875c8ac-649c-4035-bb58-6041c3d998ea"). InnerVolumeSpecName "kube-api-access-s5vw8". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 02:03:44.647344 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:03:44.647311 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1875c8ac-649c-4035-bb58-6041c3d998ea-oidc-ca" (OuterVolumeSpecName: "oidc-ca") pod "1875c8ac-649c-4035-bb58-6041c3d998ea" (UID: "1875c8ac-649c-4035-bb58-6041c3d998ea"). InnerVolumeSpecName "oidc-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 02:03:44.652332 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:03:44.652305 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1875c8ac-649c-4035-bb58-6041c3d998ea-tls-cert" (OuterVolumeSpecName: "tls-cert") pod "1875c8ac-649c-4035-bb58-6041c3d998ea" (UID: "1875c8ac-649c-4035-bb58-6041c3d998ea"). InnerVolumeSpecName "tls-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 02:03:44.742692 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:03:44.742654 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-s5vw8\" (UniqueName: \"kubernetes.io/projected/1875c8ac-649c-4035-bb58-6041c3d998ea-kube-api-access-s5vw8\") on node \"ip-10-0-129-52.ec2.internal\" DevicePath \"\"" Apr 21 02:03:44.742692 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:03:44.742683 2573 reconciler_common.go:299] "Volume detached for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/1875c8ac-649c-4035-bb58-6041c3d998ea-tls-cert\") on node \"ip-10-0-129-52.ec2.internal\" DevicePath \"\"" Apr 21 02:03:44.742692 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:03:44.742693 2573 reconciler_common.go:299] "Volume detached for volume \"oidc-ca\" (UniqueName: \"kubernetes.io/configmap/1875c8ac-649c-4035-bb58-6041c3d998ea-oidc-ca\") on node \"ip-10-0-129-52.ec2.internal\" DevicePath \"\"" Apr 21 02:03:45.300880 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:03:45.300833 2573 generic.go:358] "Generic (PLEG): container finished" 
podID="1875c8ac-649c-4035-bb58-6041c3d998ea" containerID="cbd9ea2f478e8a2b1f6135606fc39fb2bb42a8db94526a1595e020305f646374" exitCode=0 Apr 21 02:03:45.301304 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:03:45.300902 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-85ddc86fdf-82kxh" Apr 21 02:03:45.301304 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:03:45.300906 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-85ddc86fdf-82kxh" event={"ID":"1875c8ac-649c-4035-bb58-6041c3d998ea","Type":"ContainerDied","Data":"cbd9ea2f478e8a2b1f6135606fc39fb2bb42a8db94526a1595e020305f646374"} Apr 21 02:03:45.301304 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:03:45.300940 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-85ddc86fdf-82kxh" event={"ID":"1875c8ac-649c-4035-bb58-6041c3d998ea","Type":"ContainerDied","Data":"feeaf76789fa3f856877a74ca3dc33cac4349936f9527fb87ad864f1802dd558"} Apr 21 02:03:45.301304 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:03:45.300970 2573 scope.go:117] "RemoveContainer" containerID="cbd9ea2f478e8a2b1f6135606fc39fb2bb42a8db94526a1595e020305f646374" Apr 21 02:03:45.310438 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:03:45.310413 2573 scope.go:117] "RemoveContainer" containerID="cbd9ea2f478e8a2b1f6135606fc39fb2bb42a8db94526a1595e020305f646374" Apr 21 02:03:45.310720 ip-10-0-129-52 kubenswrapper[2573]: E0421 02:03:45.310701 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cbd9ea2f478e8a2b1f6135606fc39fb2bb42a8db94526a1595e020305f646374\": container with ID starting with cbd9ea2f478e8a2b1f6135606fc39fb2bb42a8db94526a1595e020305f646374 not found: ID does not exist" containerID="cbd9ea2f478e8a2b1f6135606fc39fb2bb42a8db94526a1595e020305f646374" Apr 21 02:03:45.310764 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:03:45.310730 2573 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cbd9ea2f478e8a2b1f6135606fc39fb2bb42a8db94526a1595e020305f646374"} err="failed to get container status \"cbd9ea2f478e8a2b1f6135606fc39fb2bb42a8db94526a1595e020305f646374\": rpc error: code = NotFound desc = could not find container \"cbd9ea2f478e8a2b1f6135606fc39fb2bb42a8db94526a1595e020305f646374\": container with ID starting with cbd9ea2f478e8a2b1f6135606fc39fb2bb42a8db94526a1595e020305f646374 not found: ID does not exist" Apr 21 02:03:45.316650 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:03:45.316616 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-85ddc86fdf-82kxh"] Apr 21 02:03:45.319066 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:03:45.319042 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-85ddc86fdf-82kxh"] Apr 21 02:03:47.063482 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:03:47.063446 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1875c8ac-649c-4035-bb58-6041c3d998ea" path="/var/lib/kubelet/pods/1875c8ac-649c-4035-bb58-6041c3d998ea/volumes" Apr 21 02:04:24.275734 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:04:24.275702 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-668775b6cb-nh4bv_2895a5c8-73cb-4786-9e40-b0d81dc476ba/authorino/0.log" Apr 21 02:04:28.189317 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:04:28.189284 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-api-55874489bb-5l46x_1c20f62f-3957-4098-8b95-fd58e30ca1cd/maas-api/0.log" Apr 21 02:04:28.765134 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:04:28.765101 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-64bbc69db5-jqbhx_a421cc19-79ad-41ba-8dfc-971995cc31a0/manager/0.log" Apr 21 02:04:29.640871 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:04:29.640829 2573 
log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759tjhgt_f02c586b-d484-49c2-aa48-94fc19cf622b/pull/0.log" Apr 21 02:04:29.646971 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:04:29.646951 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759tjhgt_f02c586b-d484-49c2-aa48-94fc19cf622b/extract/0.log" Apr 21 02:04:29.652877 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:04:29.652854 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759tjhgt_f02c586b-d484-49c2-aa48-94fc19cf622b/util/0.log" Apr 21 02:04:29.763126 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:04:29.763097 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0djk5f_1000fd17-2fc7-4cc5-bdb8-15d3062b2f81/util/0.log" Apr 21 02:04:29.768569 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:04:29.768547 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0djk5f_1000fd17-2fc7-4cc5-bdb8-15d3062b2f81/pull/0.log" Apr 21 02:04:29.774056 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:04:29.774026 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0djk5f_1000fd17-2fc7-4cc5-bdb8-15d3062b2f81/extract/0.log" Apr 21 02:04:29.879539 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:04:29.879511 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73vqlr6_04d6ec82-73db-455a-856e-49cff06f126b/util/0.log" Apr 21 02:04:29.885325 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:04:29.885303 2573 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73vqlr6_04d6ec82-73db-455a-856e-49cff06f126b/pull/0.log" Apr 21 02:04:29.890929 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:04:29.890861 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73vqlr6_04d6ec82-73db-455a-856e-49cff06f126b/extract/0.log" Apr 21 02:04:30.005473 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:04:30.005432 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1csfqc_b88f3b10-9e35-4408-8341-e1cac20e0332/pull/0.log" Apr 21 02:04:30.011198 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:04:30.011160 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1csfqc_b88f3b10-9e35-4408-8341-e1cac20e0332/extract/0.log" Apr 21 02:04:30.016674 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:04:30.016658 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1csfqc_b88f3b10-9e35-4408-8341-e1cac20e0332/util/0.log" Apr 21 02:04:30.138917 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:04:30.138882 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-668775b6cb-nh4bv_2895a5c8-73cb-4786-9e40-b0d81dc476ba/authorino/0.log" Apr 21 02:04:30.483987 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:04:30.483959 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6cb54b5c86-klfxt_0678367a-88d0-4159-a101-f1cb29cee691/kuadrant-console-plugin/0.log" Apr 21 02:04:30.823247 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:04:30.823174 2573 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_limitador-limitador-78c99df468-6pbq9_acc755ef-95d8-4417-b574-be065be2272b/limitador/0.log" Apr 21 02:04:31.491074 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:04:31.491046 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-56c874cd68-dzm28_530dc938-f98c-4492-a65a-20e3a9d4750c/kube-auth-proxy/0.log" Apr 21 02:04:31.827238 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:04:31.827159 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-65f9585684-glq78_70601297-057d-42c8-bef0-315d6797ccfd/router/0.log" Apr 21 02:04:36.341522 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:04:36.341479 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-5xmbv/must-gather-jcj92"] Apr 21 02:04:36.342042 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:04:36.341934 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1875c8ac-649c-4035-bb58-6041c3d998ea" containerName="authorino" Apr 21 02:04:36.342042 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:04:36.341951 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="1875c8ac-649c-4035-bb58-6041c3d998ea" containerName="authorino" Apr 21 02:04:36.342042 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:04:36.342013 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="1875c8ac-649c-4035-bb58-6041c3d998ea" containerName="authorino" Apr 21 02:04:36.345388 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:04:36.345370 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-5xmbv/must-gather-jcj92" Apr 21 02:04:36.348522 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:04:36.348356 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-5xmbv\"/\"openshift-service-ca.crt\"" Apr 21 02:04:36.348522 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:04:36.348475 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-5xmbv\"/\"default-dockercfg-whz96\"" Apr 21 02:04:36.348522 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:04:36.348525 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-5xmbv\"/\"kube-root-ca.crt\"" Apr 21 02:04:36.355447 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:04:36.355421 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-5xmbv/must-gather-jcj92"] Apr 21 02:04:36.405231 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:04:36.405194 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ba92b346-496d-4271-a0af-8cc517f8e340-must-gather-output\") pod \"must-gather-jcj92\" (UID: \"ba92b346-496d-4271-a0af-8cc517f8e340\") " pod="openshift-must-gather-5xmbv/must-gather-jcj92" Apr 21 02:04:36.405418 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:04:36.405328 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4lf9l\" (UniqueName: \"kubernetes.io/projected/ba92b346-496d-4271-a0af-8cc517f8e340-kube-api-access-4lf9l\") pod \"must-gather-jcj92\" (UID: \"ba92b346-496d-4271-a0af-8cc517f8e340\") " pod="openshift-must-gather-5xmbv/must-gather-jcj92" Apr 21 02:04:36.506054 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:04:36.506012 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: 
\"kubernetes.io/empty-dir/ba92b346-496d-4271-a0af-8cc517f8e340-must-gather-output\") pod \"must-gather-jcj92\" (UID: \"ba92b346-496d-4271-a0af-8cc517f8e340\") " pod="openshift-must-gather-5xmbv/must-gather-jcj92" Apr 21 02:04:36.506224 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:04:36.506141 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4lf9l\" (UniqueName: \"kubernetes.io/projected/ba92b346-496d-4271-a0af-8cc517f8e340-kube-api-access-4lf9l\") pod \"must-gather-jcj92\" (UID: \"ba92b346-496d-4271-a0af-8cc517f8e340\") " pod="openshift-must-gather-5xmbv/must-gather-jcj92" Apr 21 02:04:36.506365 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:04:36.506345 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ba92b346-496d-4271-a0af-8cc517f8e340-must-gather-output\") pod \"must-gather-jcj92\" (UID: \"ba92b346-496d-4271-a0af-8cc517f8e340\") " pod="openshift-must-gather-5xmbv/must-gather-jcj92" Apr 21 02:04:36.514491 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:04:36.514463 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4lf9l\" (UniqueName: \"kubernetes.io/projected/ba92b346-496d-4271-a0af-8cc517f8e340-kube-api-access-4lf9l\") pod \"must-gather-jcj92\" (UID: \"ba92b346-496d-4271-a0af-8cc517f8e340\") " pod="openshift-must-gather-5xmbv/must-gather-jcj92" Apr 21 02:04:36.655508 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:04:36.655469 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-5xmbv/must-gather-jcj92" Apr 21 02:04:36.781180 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:04:36.781153 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-5xmbv/must-gather-jcj92"] Apr 21 02:04:36.782044 ip-10-0-129-52 kubenswrapper[2573]: W0421 02:04:36.782016 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podba92b346_496d_4271_a0af_8cc517f8e340.slice/crio-7281effa23565e16f282c05ee093499d04c174e6f6d2f5dc578a4b69c37f1480 WatchSource:0}: Error finding container 7281effa23565e16f282c05ee093499d04c174e6f6d2f5dc578a4b69c37f1480: Status 404 returned error can't find the container with id 7281effa23565e16f282c05ee093499d04c174e6f6d2f5dc578a4b69c37f1480 Apr 21 02:04:37.512988 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:04:37.512941 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5xmbv/must-gather-jcj92" event={"ID":"ba92b346-496d-4271-a0af-8cc517f8e340","Type":"ContainerStarted","Data":"7281effa23565e16f282c05ee093499d04c174e6f6d2f5dc578a4b69c37f1480"} Apr 21 02:04:38.518978 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:04:38.518946 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5xmbv/must-gather-jcj92" event={"ID":"ba92b346-496d-4271-a0af-8cc517f8e340","Type":"ContainerStarted","Data":"0b178cb77819813877d9d3c6de3887d990d1eb996d305b1bd52d7786281c3f34"} Apr 21 02:04:38.518978 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:04:38.518981 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5xmbv/must-gather-jcj92" event={"ID":"ba92b346-496d-4271-a0af-8cc517f8e340","Type":"ContainerStarted","Data":"b3c0605ed7a26953f3c87a1544e17c3d58f7eab650ae69241d0af81993ae5fa6"} Apr 21 02:04:38.533315 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:04:38.533255 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-must-gather-5xmbv/must-gather-jcj92" podStartSLOduration=1.6261914769999999 podStartE2EDuration="2.533235596s" podCreationTimestamp="2026-04-21 02:04:36 +0000 UTC" firstStartedPulling="2026-04-21 02:04:36.783744455 +0000 UTC m=+874.363046491" lastFinishedPulling="2026-04-21 02:04:37.690788574 +0000 UTC m=+875.270090610" observedRunningTime="2026-04-21 02:04:38.533105273 +0000 UTC m=+876.112407368" watchObservedRunningTime="2026-04-21 02:04:38.533235596 +0000 UTC m=+876.112537655"
Apr 21 02:04:39.382057 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:04:39.382025 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-dxbng_be58fe9c-7c3d-40c0-9c75-448c5d3d856c/global-pull-secret-syncer/0.log"
Apr 21 02:04:39.454982 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:04:39.454948 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-l94wt_25059272-61bd-4d87-9141-9036eaa06ce3/konnectivity-agent/0.log"
Apr 21 02:04:39.519875 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:04:39.519841 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-129-52.ec2.internal_bc7f96eee01b7ac864ac98f5fb6e45b4/haproxy/0.log"
Apr 21 02:04:43.202427 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:04:43.202397 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759tjhgt_f02c586b-d484-49c2-aa48-94fc19cf622b/extract/0.log"
Apr 21 02:04:43.223047 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:04:43.223016 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759tjhgt_f02c586b-d484-49c2-aa48-94fc19cf622b/util/0.log"
Apr 21 02:04:43.251177 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:04:43.251122 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759tjhgt_f02c586b-d484-49c2-aa48-94fc19cf622b/pull/0.log"
Apr 21 02:04:43.277449 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:04:43.276284 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0djk5f_1000fd17-2fc7-4cc5-bdb8-15d3062b2f81/extract/0.log"
Apr 21 02:04:43.300006 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:04:43.299476 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0djk5f_1000fd17-2fc7-4cc5-bdb8-15d3062b2f81/util/0.log"
Apr 21 02:04:43.320389 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:04:43.320358 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0djk5f_1000fd17-2fc7-4cc5-bdb8-15d3062b2f81/pull/0.log"
Apr 21 02:04:43.345175 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:04:43.345135 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73vqlr6_04d6ec82-73db-455a-856e-49cff06f126b/extract/0.log"
Apr 21 02:04:43.378611 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:04:43.378581 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73vqlr6_04d6ec82-73db-455a-856e-49cff06f126b/util/0.log"
Apr 21 02:04:43.395533 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:04:43.395489 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73vqlr6_04d6ec82-73db-455a-856e-49cff06f126b/pull/0.log"
Apr 21 02:04:43.426404 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:04:43.426370 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1csfqc_b88f3b10-9e35-4408-8341-e1cac20e0332/extract/0.log"
Apr 21 02:04:43.448809 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:04:43.448780 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1csfqc_b88f3b10-9e35-4408-8341-e1cac20e0332/util/0.log"
Apr 21 02:04:43.469654 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:04:43.469530 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1csfqc_b88f3b10-9e35-4408-8341-e1cac20e0332/pull/0.log"
Apr 21 02:04:43.516433 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:04:43.516397 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-668775b6cb-nh4bv_2895a5c8-73cb-4786-9e40-b0d81dc476ba/authorino/0.log"
Apr 21 02:04:43.599675 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:04:43.599637 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6cb54b5c86-klfxt_0678367a-88d0-4159-a101-f1cb29cee691/kuadrant-console-plugin/0.log"
Apr 21 02:04:43.684720 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:04:43.684693 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-78c99df468-6pbq9_acc755ef-95d8-4417-b574-be065be2272b/limitador/0.log"
Apr 21 02:04:44.914315 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:04:44.914289 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_d162abce-05f8-4e2b-9911-404d5bc08b63/alertmanager/0.log"
Apr 21 02:04:44.936494 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:04:44.936463 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_d162abce-05f8-4e2b-9911-404d5bc08b63/config-reloader/0.log"
Apr 21 02:04:44.965337 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:04:44.965309 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_d162abce-05f8-4e2b-9911-404d5bc08b63/kube-rbac-proxy-web/0.log"
Apr 21 02:04:44.986413 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:04:44.986385 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_d162abce-05f8-4e2b-9911-404d5bc08b63/kube-rbac-proxy/0.log"
Apr 21 02:04:45.010003 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:04:45.009939 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_d162abce-05f8-4e2b-9911-404d5bc08b63/kube-rbac-proxy-metric/0.log"
Apr 21 02:04:45.034107 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:04:45.034082 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_d162abce-05f8-4e2b-9911-404d5bc08b63/prom-label-proxy/0.log"
Apr 21 02:04:45.058220 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:04:45.058176 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_d162abce-05f8-4e2b-9911-404d5bc08b63/init-config-reloader/0.log"
Apr 21 02:04:45.113989 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:04:45.113962 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-bklrp_a77bc043-0862-487d-a125-8ad685a25aed/kube-state-metrics/0.log"
Apr 21 02:04:45.134283 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:04:45.134254 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-bklrp_a77bc043-0862-487d-a125-8ad685a25aed/kube-rbac-proxy-main/0.log"
Apr 21 02:04:45.155953 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:04:45.155921 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-bklrp_a77bc043-0862-487d-a125-8ad685a25aed/kube-rbac-proxy-self/0.log"
Apr 21 02:04:45.180887 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:04:45.180792 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-f9b78694b-4qqtk_c3078e3f-e828-4443-9fa2-f548a350e468/metrics-server/0.log"
Apr 21 02:04:45.347314 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:04:45.347287 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-npfqq_d46793fb-d7db-4f84-895f-8b7c420d58ab/node-exporter/0.log"
Apr 21 02:04:45.369398 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:04:45.369368 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-npfqq_d46793fb-d7db-4f84-895f-8b7c420d58ab/kube-rbac-proxy/0.log"
Apr 21 02:04:45.398013 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:04:45.397978 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-npfqq_d46793fb-d7db-4f84-895f-8b7c420d58ab/init-textfile/0.log"
Apr 21 02:04:45.497402 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:04:45.497328 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-gptvn_48803557-e6ce-4aa0-8b91-f591cb28551b/kube-rbac-proxy-main/0.log"
Apr 21 02:04:45.517558 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:04:45.517528 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-gptvn_48803557-e6ce-4aa0-8b91-f591cb28551b/kube-rbac-proxy-self/0.log"
Apr 21 02:04:45.538200 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:04:45.538165 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-gptvn_48803557-e6ce-4aa0-8b91-f591cb28551b/openshift-state-metrics/0.log"
Apr 21 02:04:45.739017 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:04:45.738984 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-2vpnk_820f6e10-eab2-4509-ab5a-c5aca3e4b769/prometheus-operator/0.log"
Apr 21 02:04:45.756562 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:04:45.756475 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-2vpnk_820f6e10-eab2-4509-ab5a-c5aca3e4b769/kube-rbac-proxy/0.log"
Apr 21 02:04:45.784390 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:04:45.784348 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-pnf4r_84f2628a-fb6e-4827-bbeb-2a4ae45d0b29/prometheus-operator-admission-webhook/0.log"
Apr 21 02:04:45.826119 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:04:45.826092 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-7b7444b98f-2vtcr_671f9929-3233-4c87-98c8-8cbd3f38929c/telemeter-client/0.log"
Apr 21 02:04:45.844208 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:04:45.844180 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-7b7444b98f-2vtcr_671f9929-3233-4c87-98c8-8cbd3f38929c/reload/0.log"
Apr 21 02:04:45.866108 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:04:45.866040 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-7b7444b98f-2vtcr_671f9929-3233-4c87-98c8-8cbd3f38929c/kube-rbac-proxy/0.log"
Apr 21 02:04:45.895628 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:04:45.895573 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-79669bdb7f-dcqbv_f583a61c-5c94-4c0a-bc35-38bbbde0ca46/thanos-query/0.log"
Apr 21 02:04:45.914149 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:04:45.914110 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-79669bdb7f-dcqbv_f583a61c-5c94-4c0a-bc35-38bbbde0ca46/kube-rbac-proxy-web/0.log"
Apr 21 02:04:45.935626 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:04:45.935599 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-79669bdb7f-dcqbv_f583a61c-5c94-4c0a-bc35-38bbbde0ca46/kube-rbac-proxy/0.log"
Apr 21 02:04:45.959196 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:04:45.959056 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-79669bdb7f-dcqbv_f583a61c-5c94-4c0a-bc35-38bbbde0ca46/prom-label-proxy/0.log"
Apr 21 02:04:45.983187 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:04:45.983161 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-79669bdb7f-dcqbv_f583a61c-5c94-4c0a-bc35-38bbbde0ca46/kube-rbac-proxy-rules/0.log"
Apr 21 02:04:46.004479 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:04:46.004450 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-79669bdb7f-dcqbv_f583a61c-5c94-4c0a-bc35-38bbbde0ca46/kube-rbac-proxy-metrics/0.log"
Apr 21 02:04:48.038829 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:04:48.038777 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-5xmbv/perf-node-gather-daemonset-r45lx"]
Apr 21 02:04:48.050131 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:04:48.050093 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5xmbv/perf-node-gather-daemonset-r45lx"
Apr 21 02:04:48.055087 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:04:48.053852 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-5xmbv/perf-node-gather-daemonset-r45lx"]
Apr 21 02:04:48.128859 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:04:48.128678 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/01ff70b5-f62b-459e-b720-17945991b1d5-lib-modules\") pod \"perf-node-gather-daemonset-r45lx\" (UID: \"01ff70b5-f62b-459e-b720-17945991b1d5\") " pod="openshift-must-gather-5xmbv/perf-node-gather-daemonset-r45lx"
Apr 21 02:04:48.128859 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:04:48.128796 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wr999\" (UniqueName: \"kubernetes.io/projected/01ff70b5-f62b-459e-b720-17945991b1d5-kube-api-access-wr999\") pod \"perf-node-gather-daemonset-r45lx\" (UID: \"01ff70b5-f62b-459e-b720-17945991b1d5\") " pod="openshift-must-gather-5xmbv/perf-node-gather-daemonset-r45lx"
Apr 21 02:04:48.128859 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:04:48.128859 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/01ff70b5-f62b-459e-b720-17945991b1d5-proc\") pod \"perf-node-gather-daemonset-r45lx\" (UID: \"01ff70b5-f62b-459e-b720-17945991b1d5\") " pod="openshift-must-gather-5xmbv/perf-node-gather-daemonset-r45lx"
Apr 21 02:04:48.129146 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:04:48.128885 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/01ff70b5-f62b-459e-b720-17945991b1d5-podres\") pod \"perf-node-gather-daemonset-r45lx\" (UID: \"01ff70b5-f62b-459e-b720-17945991b1d5\") " pod="openshift-must-gather-5xmbv/perf-node-gather-daemonset-r45lx"
Apr 21 02:04:48.129146 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:04:48.128924 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/01ff70b5-f62b-459e-b720-17945991b1d5-sys\") pod \"perf-node-gather-daemonset-r45lx\" (UID: \"01ff70b5-f62b-459e-b720-17945991b1d5\") " pod="openshift-must-gather-5xmbv/perf-node-gather-daemonset-r45lx"
Apr 21 02:04:48.213115 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:04:48.213087 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7f6957968c-9wh6s_66283c5e-bf84-4eca-adf5-0e75883b19b2/console/0.log"
Apr 21 02:04:48.229953 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:04:48.229914 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wr999\" (UniqueName: \"kubernetes.io/projected/01ff70b5-f62b-459e-b720-17945991b1d5-kube-api-access-wr999\") pod \"perf-node-gather-daemonset-r45lx\" (UID: \"01ff70b5-f62b-459e-b720-17945991b1d5\") " pod="openshift-must-gather-5xmbv/perf-node-gather-daemonset-r45lx"
Apr 21 02:04:48.230139 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:04:48.229981 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/01ff70b5-f62b-459e-b720-17945991b1d5-proc\") pod \"perf-node-gather-daemonset-r45lx\" (UID: \"01ff70b5-f62b-459e-b720-17945991b1d5\") " pod="openshift-must-gather-5xmbv/perf-node-gather-daemonset-r45lx"
Apr 21 02:04:48.230139 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:04:48.230005 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/01ff70b5-f62b-459e-b720-17945991b1d5-podres\") pod \"perf-node-gather-daemonset-r45lx\" (UID: \"01ff70b5-f62b-459e-b720-17945991b1d5\") " pod="openshift-must-gather-5xmbv/perf-node-gather-daemonset-r45lx"
Apr 21 02:04:48.230139 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:04:48.230042 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/01ff70b5-f62b-459e-b720-17945991b1d5-sys\") pod \"perf-node-gather-daemonset-r45lx\" (UID: \"01ff70b5-f62b-459e-b720-17945991b1d5\") " pod="openshift-must-gather-5xmbv/perf-node-gather-daemonset-r45lx"
Apr 21 02:04:48.230139 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:04:48.230119 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/01ff70b5-f62b-459e-b720-17945991b1d5-lib-modules\") pod \"perf-node-gather-daemonset-r45lx\" (UID: \"01ff70b5-f62b-459e-b720-17945991b1d5\") " pod="openshift-must-gather-5xmbv/perf-node-gather-daemonset-r45lx"
Apr 21 02:04:48.230364 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:04:48.230246 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/01ff70b5-f62b-459e-b720-17945991b1d5-sys\") pod \"perf-node-gather-daemonset-r45lx\" (UID: \"01ff70b5-f62b-459e-b720-17945991b1d5\") " pod="openshift-must-gather-5xmbv/perf-node-gather-daemonset-r45lx"
Apr 21 02:04:48.230364 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:04:48.230284 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/01ff70b5-f62b-459e-b720-17945991b1d5-podres\") pod \"perf-node-gather-daemonset-r45lx\" (UID: \"01ff70b5-f62b-459e-b720-17945991b1d5\") " pod="openshift-must-gather-5xmbv/perf-node-gather-daemonset-r45lx"
Apr 21 02:04:48.230364 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:04:48.230321 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/01ff70b5-f62b-459e-b720-17945991b1d5-proc\") pod \"perf-node-gather-daemonset-r45lx\" (UID: \"01ff70b5-f62b-459e-b720-17945991b1d5\") " pod="openshift-must-gather-5xmbv/perf-node-gather-daemonset-r45lx"
Apr 21 02:04:48.230524 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:04:48.230442 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/01ff70b5-f62b-459e-b720-17945991b1d5-lib-modules\") pod \"perf-node-gather-daemonset-r45lx\" (UID: \"01ff70b5-f62b-459e-b720-17945991b1d5\") " pod="openshift-must-gather-5xmbv/perf-node-gather-daemonset-r45lx"
Apr 21 02:04:48.237667 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:04:48.237636 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wr999\" (UniqueName: \"kubernetes.io/projected/01ff70b5-f62b-459e-b720-17945991b1d5-kube-api-access-wr999\") pod \"perf-node-gather-daemonset-r45lx\" (UID: \"01ff70b5-f62b-459e-b720-17945991b1d5\") " pod="openshift-must-gather-5xmbv/perf-node-gather-daemonset-r45lx"
Apr 21 02:04:48.398189 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:04:48.398153 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5xmbv/perf-node-gather-daemonset-r45lx"
Apr 21 02:04:48.574835 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:04:48.569536 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-5xmbv/perf-node-gather-daemonset-r45lx"]
Apr 21 02:04:48.581107 ip-10-0-129-52 kubenswrapper[2573]: W0421 02:04:48.581059 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod01ff70b5_f62b_459e_b720_17945991b1d5.slice/crio-343321c905ff5c8686b8c464a4af9b1e813895cc1465e31b7768acf3e968353b WatchSource:0}: Error finding container 343321c905ff5c8686b8c464a4af9b1e813895cc1465e31b7768acf3e968353b: Status 404 returned error can't find the container with id 343321c905ff5c8686b8c464a4af9b1e813895cc1465e31b7768acf3e968353b
Apr 21 02:04:49.582152 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:04:49.582108 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5xmbv/perf-node-gather-daemonset-r45lx" event={"ID":"01ff70b5-f62b-459e-b720-17945991b1d5","Type":"ContainerStarted","Data":"b6dcbc22d07e9d004e89c35e7108c7c516e75a13453f36ca61c5e0f22b5f862d"}
Apr 21 02:04:49.582152 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:04:49.582159 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5xmbv/perf-node-gather-daemonset-r45lx" event={"ID":"01ff70b5-f62b-459e-b720-17945991b1d5","Type":"ContainerStarted","Data":"343321c905ff5c8686b8c464a4af9b1e813895cc1465e31b7768acf3e968353b"}
Apr 21 02:04:49.582673 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:04:49.582306 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-5xmbv/perf-node-gather-daemonset-r45lx"
Apr 21 02:04:49.598012 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:04:49.597949 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-5xmbv/perf-node-gather-daemonset-r45lx" podStartSLOduration=1.597929409 podStartE2EDuration="1.597929409s" podCreationTimestamp="2026-04-21 02:04:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 02:04:49.595732718 +0000 UTC m=+887.175034777" watchObservedRunningTime="2026-04-21 02:04:49.597929409 +0000 UTC m=+887.177231468"
Apr 21 02:04:49.625026 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:04:49.624997 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-w2qdq_8598fef8-2cf7-4d82-aa02-44eac46217af/dns/0.log"
Apr 21 02:04:49.646941 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:04:49.646909 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-w2qdq_8598fef8-2cf7-4d82-aa02-44eac46217af/kube-rbac-proxy/0.log"
Apr 21 02:04:49.708606 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:04:49.708582 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-w4c4r_cc11073d-4e8d-4b5b-b1bd-40ad61aa03b0/dns-node-resolver/0.log"
Apr 21 02:04:50.304952 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:04:50.304920 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-xfl2x_6de78603-c646-45a5-8bc4-9cfc56456d0f/node-ca/0.log"
Apr 21 02:04:51.125877 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:04:51.125844 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-56c874cd68-dzm28_530dc938-f98c-4492-a65a-20e3a9d4750c/kube-auth-proxy/0.log"
Apr 21 02:04:51.199015 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:04:51.198986 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-65f9585684-glq78_70601297-057d-42c8-bef0-315d6797ccfd/router/0.log"
Apr 21 02:04:51.718637 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:04:51.718611 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-r9t8h_ff1d28c8-cbb1-4385-9c36-da62f691590f/serve-healthcheck-canary/0.log"
Apr 21 02:04:52.140092 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:04:52.140067 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-6ghj7_2771f995-2e2b-48cb-a750-37d375badbcd/kube-rbac-proxy/0.log"
Apr 21 02:04:52.157972 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:04:52.157943 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-6ghj7_2771f995-2e2b-48cb-a750-37d375badbcd/exporter/0.log"
Apr 21 02:04:52.176574 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:04:52.176550 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-6ghj7_2771f995-2e2b-48cb-a750-37d375badbcd/extractor/0.log"
Apr 21 02:04:54.181092 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:04:54.181066 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-api-55874489bb-5l46x_1c20f62f-3957-4098-8b95-fd58e30ca1cd/maas-api/0.log"
Apr 21 02:04:54.333906 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:04:54.333881 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-64bbc69db5-jqbhx_a421cc19-79ad-41ba-8dfc-971995cc31a0/manager/0.log"
Apr 21 02:04:55.559211 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:04:55.559180 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-5bf8b8945f-qmpmb_97eee220-e43b-4185-a8b5-93170f8ceacd/manager/0.log"
Apr 21 02:04:55.597495 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:04:55.597469 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-5xmbv/perf-node-gather-daemonset-r45lx"
Apr 21 02:05:01.113303 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:05:01.113267 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-8h5m5_7b9f858d-0cf6-49d9-8632-23d2d0584e24/kube-multus-additional-cni-plugins/0.log"
Apr 21 02:05:01.132023 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:05:01.131991 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-8h5m5_7b9f858d-0cf6-49d9-8632-23d2d0584e24/egress-router-binary-copy/0.log"
Apr 21 02:05:01.152610 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:05:01.152583 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-8h5m5_7b9f858d-0cf6-49d9-8632-23d2d0584e24/cni-plugins/0.log"
Apr 21 02:05:01.171645 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:05:01.171620 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-8h5m5_7b9f858d-0cf6-49d9-8632-23d2d0584e24/bond-cni-plugin/0.log"
Apr 21 02:05:01.189648 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:05:01.189617 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-8h5m5_7b9f858d-0cf6-49d9-8632-23d2d0584e24/routeoverride-cni/0.log"
Apr 21 02:05:01.208907 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:05:01.208882 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-8h5m5_7b9f858d-0cf6-49d9-8632-23d2d0584e24/whereabouts-cni-bincopy/0.log"
Apr 21 02:05:01.244699 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:05:01.244666 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-8h5m5_7b9f858d-0cf6-49d9-8632-23d2d0584e24/whereabouts-cni/0.log"
Apr 21 02:05:01.638483 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:05:01.638452 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-fdckn_214bcf67-a154-4c72-a914-d9efa8bdfee9/kube-multus/0.log"
Apr 21 02:05:01.720632 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:05:01.720607 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-mfs4c_9c103689-40cc-470b-9109-33a63ff6f5dd/network-metrics-daemon/0.log"
Apr 21 02:05:01.738000 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:05:01.737963 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-mfs4c_9c103689-40cc-470b-9109-33a63ff6f5dd/kube-rbac-proxy/0.log"
Apr 21 02:05:03.017518 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:05:03.017490 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f2tdx_ed377958-ce5b-41c7-9512-4b95b799767d/ovn-controller/0.log"
Apr 21 02:05:03.020806 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:05:03.020785 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f2tdx_ed377958-ce5b-41c7-9512-4b95b799767d/ovn-acl-logging/0.log"
Apr 21 02:05:03.023094 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:05:03.023075 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f2tdx_ed377958-ce5b-41c7-9512-4b95b799767d/ovn-acl-logging/0.log"
Apr 21 02:05:03.032187 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:05:03.032165 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f2tdx_ed377958-ce5b-41c7-9512-4b95b799767d/ovn-acl-logging/0.log"
Apr 21 02:05:03.037719 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:05:03.037701 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f2tdx_ed377958-ce5b-41c7-9512-4b95b799767d/ovn-acl-logging/1.log"
Apr 21 02:05:03.058767 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:05:03.058739 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f2tdx_ed377958-ce5b-41c7-9512-4b95b799767d/kube-rbac-proxy-node/0.log"
Apr 21 02:05:03.077927 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:05:03.077895 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f2tdx_ed377958-ce5b-41c7-9512-4b95b799767d/kube-rbac-proxy-ovn-metrics/0.log"
Apr 21 02:05:03.095166 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:05:03.095141 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f2tdx_ed377958-ce5b-41c7-9512-4b95b799767d/northd/0.log"
Apr 21 02:05:03.113919 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:05:03.113891 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f2tdx_ed377958-ce5b-41c7-9512-4b95b799767d/nbdb/0.log"
Apr 21 02:05:03.132646 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:05:03.132616 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f2tdx_ed377958-ce5b-41c7-9512-4b95b799767d/sbdb/0.log"
Apr 21 02:05:03.251446 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:05:03.251415 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f2tdx_ed377958-ce5b-41c7-9512-4b95b799767d/ovnkube-controller/0.log"
Apr 21 02:05:04.303518 ip-10-0-129-52 kubenswrapper[2573]: I0421 02:05:04.303492 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-bvqj7_560db137-e262-4c6c-9380-c422a8537e5e/network-check-target-container/0.log"