Apr 20 07:50:02.722920 ip-10-0-129-24 systemd[1]: Starting Kubernetes Kubelet...
Apr 20 07:50:03.111466 ip-10-0-129-24 kubenswrapper[2575]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 07:50:03.111466 ip-10-0-129-24 kubenswrapper[2575]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 20 07:50:03.111466 ip-10-0-129-24 kubenswrapper[2575]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 07:50:03.111466 ip-10-0-129-24 kubenswrapper[2575]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 20 07:50:03.111466 ip-10-0-129-24 kubenswrapper[2575]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
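The deprecation warnings above all point at the same migration path: each flagged option has a corresponding field in the KubeletConfiguration object passed via --config. A minimal sketch of the config-file equivalents (field values here are illustrative examples, not read from this node):

```yaml
# Hypothetical /etc/kubernetes/kubelet.conf fragment; values are examples only.
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
# replaces --container-runtime-endpoint
containerRuntimeEndpoint: unix:///var/run/crio/crio.sock
# replaces --volume-plugin-dir
volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec
# replaces --system-reserved
systemReserved:
  cpu: 500m
  memory: 1Gi
# replaces --minimum-container-ttl-duration (per the warning, use eviction settings instead)
evictionHard:
  memory.available: 100Mi
```

Command-line flags still take precedence over the config file when both are set, which is why the kubelet only warns here instead of refusing to start.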
Apr 20 07:50:03.113941 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.113842 2575 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 20 07:50:03.119446 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.119420 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 07:50:03.119446 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.119442 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 07:50:03.119446 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.119446 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 07:50:03.119446 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.119449 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 07:50:03.119446 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.119452 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 07:50:03.119672 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.119457 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 07:50:03.119672 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.119460 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 07:50:03.119672 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.119464 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 07:50:03.119672 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.119467 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 07:50:03.119672 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.119470 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 07:50:03.119672 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.119473 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 07:50:03.119672 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.119476 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 07:50:03.119672 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.119478 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 07:50:03.119672 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.119481 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 07:50:03.119672 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.119485 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 07:50:03.119672 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.119487 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 07:50:03.119672 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.119490 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 07:50:03.119672 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.119492 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 07:50:03.119672 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.119495 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 07:50:03.119672 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.119497 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 07:50:03.119672 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.119500 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 07:50:03.119672 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.119503 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 07:50:03.119672 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.119506 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 07:50:03.119672 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.119509 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 07:50:03.119672 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.119511 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 07:50:03.120140 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.119514 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 07:50:03.120140 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.119516 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 07:50:03.120140 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.119522 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 07:50:03.120140 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.119525 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 07:50:03.120140 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.119529 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 07:50:03.120140 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.119531 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 07:50:03.120140 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.119535 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
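The long run of "unrecognized feature gate" warnings comes from cluster-level gates being handed to a kubelet build that does not register them; as the W lines show, they are skipped with a warning rather than failing startup. Gates the kubelet does recognize, such as the deprecated KMSv1=true flagged above, are normally set through the featureGates map of the same KubeletConfiguration. A hypothetical fragment, not taken from this node:

```yaml
# Hypothetical KubeletConfiguration fragment; gate names and values are examples only.
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
featureGates:
  KMSv1: true   # a recognized-but-deprecated gate, matching the warning logged here
```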
Apr 20 07:50:03.120140 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.119540 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 07:50:03.120140 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.119542 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 07:50:03.120140 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.119545 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 07:50:03.120140 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.119548 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 07:50:03.120140 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.119550 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 07:50:03.120140 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.119553 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 07:50:03.120140 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.119557 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 07:50:03.120140 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.119560 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 07:50:03.120140 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.119563 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 07:50:03.120140 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.119566 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 07:50:03.120140 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.119569 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 07:50:03.120140 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.119571 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 07:50:03.120605 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.119574 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 07:50:03.120605 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.119578 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 07:50:03.120605 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.119581 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 07:50:03.120605 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.119583 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 07:50:03.120605 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.119586 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 07:50:03.120605 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.119588 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 07:50:03.120605 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.119591 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 07:50:03.120605 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.119593 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 07:50:03.120605 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.119596 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 07:50:03.120605 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.119598 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 07:50:03.120605 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.119600 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 07:50:03.120605 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.119603 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 07:50:03.120605 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.119605 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 07:50:03.120605 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.119608 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 07:50:03.120605 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.119627 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 07:50:03.120605 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.119631 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 07:50:03.120605 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.119633 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 07:50:03.120605 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.119636 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 07:50:03.120605 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.119638 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 07:50:03.120605 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.119641 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 07:50:03.121111 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.119643 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 07:50:03.121111 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.119646 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 07:50:03.121111 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.119648 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 07:50:03.121111 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.119652 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 07:50:03.121111 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.119654 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 07:50:03.121111 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.119657 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 07:50:03.121111 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.119660 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 07:50:03.121111 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.119663 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 07:50:03.121111 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.119666 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 07:50:03.121111 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.119669 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 07:50:03.121111 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.119671 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 07:50:03.121111 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.119674 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 07:50:03.121111 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.119677 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 07:50:03.121111 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.119679 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 07:50:03.121111 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.119683 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 07:50:03.121111 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.119686 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 07:50:03.121111 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.119689 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 07:50:03.121111 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.119692 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 07:50:03.121111 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.119696 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 07:50:03.121111 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.119699 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 07:50:03.121592 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.119701 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 07:50:03.121592 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.119705 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 07:50:03.121592 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.120100 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 07:50:03.121592 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.120105 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 07:50:03.121592 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.120108 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 07:50:03.121592 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.120111 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 07:50:03.121592 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.120114 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 07:50:03.121592 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.120116 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 07:50:03.121592 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.120119 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 07:50:03.121592 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.120121 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 07:50:03.121592 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.120124 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 07:50:03.121592 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.120127 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 07:50:03.121592 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.120129 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 07:50:03.121592 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.120132 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 07:50:03.121592 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.120134 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 07:50:03.121592 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.120137 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 07:50:03.121592 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.120140 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 07:50:03.121592 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.120143 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 07:50:03.121592 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.120146 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 07:50:03.121592 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.120149 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 07:50:03.122084 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.120153 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 07:50:03.122084 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.120155 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 07:50:03.122084 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.120158 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 07:50:03.122084 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.120160 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 07:50:03.122084 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.120163 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 07:50:03.122084 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.120165 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 07:50:03.122084 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.120168 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 07:50:03.122084 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.120171 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 07:50:03.122084 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.120173 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 07:50:03.122084 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.120176 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 07:50:03.122084 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.120179 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 07:50:03.122084 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.120181 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 07:50:03.122084 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.120184 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 07:50:03.122084 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.120187 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 07:50:03.122084 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.120189 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 07:50:03.122084 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.120192 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 07:50:03.122084 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.120194 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 07:50:03.122084 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.120197 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 07:50:03.122084 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.120200 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 07:50:03.122084 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.120202 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 07:50:03.122647 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.120205 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 07:50:03.122647 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.120208 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 07:50:03.122647 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.120210 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 07:50:03.122647 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.120213 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 07:50:03.122647 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.120215 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 07:50:03.122647 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.120218 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 07:50:03.122647 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.120220 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 07:50:03.122647 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.120225 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 07:50:03.122647 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.120227 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 07:50:03.122647 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.120230 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 07:50:03.122647 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.120233 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 07:50:03.122647 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.120236 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 07:50:03.122647 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.120239 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 07:50:03.122647 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.120241 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 07:50:03.122647 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.120244 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 07:50:03.122647 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.120247 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 07:50:03.122647 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.120249 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 07:50:03.122647 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.120252 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 07:50:03.122647 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.120254 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 07:50:03.122647 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.120257 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 07:50:03.123138 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.120260 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 07:50:03.123138 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.120262 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 07:50:03.123138 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.120264 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 07:50:03.123138 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.120267 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 07:50:03.123138 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.120269 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 07:50:03.123138 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.120272 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 07:50:03.123138 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.120275 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 07:50:03.123138 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.120277 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 07:50:03.123138 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.120281 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 07:50:03.123138 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.120284 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 07:50:03.123138 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.120287 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 07:50:03.123138 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.120290 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 07:50:03.123138 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.120293 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 07:50:03.123138 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.120295 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 07:50:03.123138 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.120299 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 07:50:03.123138 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.120302 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 07:50:03.123138 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.120305 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 07:50:03.123138 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.120307 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 07:50:03.123138 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.120310 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 07:50:03.123597 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.120312 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 07:50:03.123597 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.120315 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 07:50:03.123597 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.120319 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 07:50:03.123597 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.120323 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 07:50:03.123597 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.120325 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 07:50:03.123597 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.120328 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 07:50:03.123597 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.120331 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 07:50:03.123597 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.120333 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 07:50:03.123597 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.120336 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 07:50:03.123597 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.121504 2575 flags.go:64] FLAG: --address="0.0.0.0"
Apr 20 07:50:03.123597 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.121514 2575 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 20 07:50:03.123597 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.121521 2575 flags.go:64] FLAG: --anonymous-auth="true"
Apr 20 07:50:03.123597 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.121526 2575 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 20 07:50:03.123597 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.121532 2575 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 20 07:50:03.123597 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.121536 2575 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 20 07:50:03.123597 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.121541 2575 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 20 07:50:03.123597 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.121545 2575 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 20 07:50:03.123597 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.121549 2575 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 20 07:50:03.123597 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.121552 2575 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 20 07:50:03.123597 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.121556 2575 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 20 07:50:03.124103 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.121559 2575 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 20 07:50:03.124103 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.121562 2575 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 20 07:50:03.124103 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.121565 2575 flags.go:64] FLAG: --cgroup-root=""
Apr 20 07:50:03.124103 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.121569 2575 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 20 07:50:03.124103 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.121572 2575 flags.go:64] FLAG: --client-ca-file=""
Apr 20 07:50:03.124103 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.121575 2575 flags.go:64] FLAG: --cloud-config=""
Apr 20 07:50:03.124103 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.121578 2575 flags.go:64] FLAG: --cloud-provider="external"
Apr 20 07:50:03.124103 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.121581 2575 flags.go:64] FLAG: --cluster-dns="[]"
Apr 20 07:50:03.124103 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.121586 2575 flags.go:64] FLAG: --cluster-domain=""
Apr 20 07:50:03.124103 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.121589 2575 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 20 07:50:03.124103 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.121593 2575 flags.go:64] FLAG: --config-dir=""
Apr 20 07:50:03.124103 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.121596 2575 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 20 07:50:03.124103 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.121599 2575 flags.go:64] FLAG: --container-log-max-files="5"
Apr 20 07:50:03.124103 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.121604 2575 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 20 07:50:03.124103 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.121607 2575 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 20 07:50:03.124103 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.121625 2575 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 20 07:50:03.124103 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.121630 2575 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 20 07:50:03.124103 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.121633 2575 flags.go:64] FLAG: --contention-profiling="false"
Apr 20 07:50:03.124103 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.121636 2575 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 20 07:50:03.124103 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.121640 2575 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 20 07:50:03.124103 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.121643 2575 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 20 07:50:03.124103 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.121646 2575 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 20 07:50:03.124103 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.121651 2575 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 20 07:50:03.124103 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.121654 2575 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 20 07:50:03.124103 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.121657 2575 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 20 07:50:03.124721 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.121660 2575 flags.go:64] FLAG: --enable-load-reader="false"
Apr 20 07:50:03.124721 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.121663 2575 flags.go:64] FLAG: --enable-server="true"
Apr 20 07:50:03.124721 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.121666 2575 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 20 07:50:03.124721 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.121671 2575 flags.go:64] FLAG: --event-burst="100"
Apr 20 07:50:03.124721 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.121674 2575 flags.go:64] FLAG: --event-qps="50"
Apr 20 07:50:03.124721 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.121677 2575 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 20 07:50:03.124721 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.121680 2575 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 20 07:50:03.124721 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.121684 2575 flags.go:64] FLAG: --eviction-hard=""
Apr 20 07:50:03.124721 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.121688 2575 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 20 07:50:03.124721 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.121691 2575 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 20 07:50:03.124721 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.121694 2575 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 20 07:50:03.124721 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.121698 2575 flags.go:64] FLAG: --eviction-soft=""
Apr 20 07:50:03.124721 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.121701 2575 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 20 07:50:03.124721 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.121704 2575 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 20 07:50:03.124721 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.121707 2575 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 20 07:50:03.124721 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.121710 2575 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 20 07:50:03.124721 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.121713 2575 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 20 07:50:03.124721 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.121716 2575 flags.go:64] FLAG: --fail-swap-on="true"
Apr 20 07:50:03.124721 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.121719 2575 flags.go:64] FLAG: --feature-gates=""
Apr 20 07:50:03.124721 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.121723 2575 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 20 07:50:03.124721 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.121726 2575 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 20 07:50:03.124721 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.121729 2575 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 20 07:50:03.124721 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.121732 2575 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 20 07:50:03.124721 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.121735 2575 flags.go:64] FLAG: --healthz-port="10248"
Apr 20 07:50:03.124721 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.121739 2575 flags.go:64] FLAG: --help="false"
Apr 20 07:50:03.125337 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.121742 2575 flags.go:64] FLAG: --hostname-override="ip-10-0-129-24.ec2.internal"
Apr 20 07:50:03.125337 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.121745 2575 flags.go:64] FLAG: --housekeeping-interval="10s"
Apr 20 07:50:03.125337 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.121748 2575 flags.go:64] FLAG: --http-check-frequency="20s"
Apr 20 07:50:03.125337 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.121751 2575 flags.go:64] FLAG:
--image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 20 07:50:03.125337 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.121754 2575 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 20 07:50:03.125337 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.121758 2575 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 20 07:50:03.125337 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.121761 2575 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 20 07:50:03.125337 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.121764 2575 flags.go:64] FLAG: --image-service-endpoint="" Apr 20 07:50:03.125337 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.121767 2575 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 20 07:50:03.125337 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.121770 2575 flags.go:64] FLAG: --kube-api-burst="100" Apr 20 07:50:03.125337 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.121773 2575 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 20 07:50:03.125337 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.121777 2575 flags.go:64] FLAG: --kube-api-qps="50" Apr 20 07:50:03.125337 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.121780 2575 flags.go:64] FLAG: --kube-reserved="" Apr 20 07:50:03.125337 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.121783 2575 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 20 07:50:03.125337 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.121785 2575 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 20 07:50:03.125337 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.121788 2575 flags.go:64] FLAG: --kubelet-cgroups="" Apr 20 07:50:03.125337 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.121791 2575 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 20 07:50:03.125337 ip-10-0-129-24 
kubenswrapper[2575]: I0420 07:50:03.121795 2575 flags.go:64] FLAG: --lock-file="" Apr 20 07:50:03.125337 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.121798 2575 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 20 07:50:03.125337 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.121801 2575 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 20 07:50:03.125337 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.121803 2575 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 20 07:50:03.125337 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.121809 2575 flags.go:64] FLAG: --log-json-split-stream="false" Apr 20 07:50:03.125337 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.121812 2575 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 20 07:50:03.125894 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.121815 2575 flags.go:64] FLAG: --log-text-split-stream="false" Apr 20 07:50:03.125894 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.121818 2575 flags.go:64] FLAG: --logging-format="text" Apr 20 07:50:03.125894 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.121821 2575 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 20 07:50:03.125894 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.121824 2575 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 20 07:50:03.125894 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.121827 2575 flags.go:64] FLAG: --manifest-url="" Apr 20 07:50:03.125894 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.121830 2575 flags.go:64] FLAG: --manifest-url-header="" Apr 20 07:50:03.125894 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.121834 2575 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 20 07:50:03.125894 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.121838 2575 flags.go:64] FLAG: --max-open-files="1000000" Apr 20 07:50:03.125894 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.121843 2575 flags.go:64] FLAG: --max-pods="110" Apr 20 07:50:03.125894 
ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.121846 2575 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 20 07:50:03.125894 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.121849 2575 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 20 07:50:03.125894 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.121852 2575 flags.go:64] FLAG: --memory-manager-policy="None" Apr 20 07:50:03.125894 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.121855 2575 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 20 07:50:03.125894 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.121858 2575 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 20 07:50:03.125894 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.121861 2575 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 20 07:50:03.125894 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.121864 2575 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 20 07:50:03.125894 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.121871 2575 flags.go:64] FLAG: --node-status-max-images="50" Apr 20 07:50:03.125894 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.121874 2575 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 20 07:50:03.125894 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.121878 2575 flags.go:64] FLAG: --oom-score-adj="-999" Apr 20 07:50:03.125894 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.121881 2575 flags.go:64] FLAG: --pod-cidr="" Apr 20 07:50:03.125894 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.121885 2575 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 20 07:50:03.125894 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.121891 2575 flags.go:64] FLAG: --pod-manifest-path="" Apr 20 07:50:03.125894 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.121893 2575 flags.go:64] 
FLAG: --pod-max-pids="-1" Apr 20 07:50:03.125894 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.121897 2575 flags.go:64] FLAG: --pods-per-core="0" Apr 20 07:50:03.126464 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.121900 2575 flags.go:64] FLAG: --port="10250" Apr 20 07:50:03.126464 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.121903 2575 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 20 07:50:03.126464 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.121906 2575 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0b8276745ef207e8a" Apr 20 07:50:03.126464 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.121910 2575 flags.go:64] FLAG: --qos-reserved="" Apr 20 07:50:03.126464 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.121913 2575 flags.go:64] FLAG: --read-only-port="10255" Apr 20 07:50:03.126464 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.121916 2575 flags.go:64] FLAG: --register-node="true" Apr 20 07:50:03.126464 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.121919 2575 flags.go:64] FLAG: --register-schedulable="true" Apr 20 07:50:03.126464 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.121922 2575 flags.go:64] FLAG: --register-with-taints="" Apr 20 07:50:03.126464 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.121926 2575 flags.go:64] FLAG: --registry-burst="10" Apr 20 07:50:03.126464 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.121929 2575 flags.go:64] FLAG: --registry-qps="5" Apr 20 07:50:03.126464 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.121932 2575 flags.go:64] FLAG: --reserved-cpus="" Apr 20 07:50:03.126464 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.121935 2575 flags.go:64] FLAG: --reserved-memory="" Apr 20 07:50:03.126464 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.121939 2575 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 20 07:50:03.126464 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.121942 2575 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 20 
07:50:03.126464 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.121945 2575 flags.go:64] FLAG: --rotate-certificates="false" Apr 20 07:50:03.126464 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.121948 2575 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 20 07:50:03.126464 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.121951 2575 flags.go:64] FLAG: --runonce="false" Apr 20 07:50:03.126464 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.121953 2575 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 20 07:50:03.126464 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.121957 2575 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 20 07:50:03.126464 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.121960 2575 flags.go:64] FLAG: --seccomp-default="false" Apr 20 07:50:03.126464 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.121963 2575 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 20 07:50:03.126464 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.121965 2575 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 20 07:50:03.126464 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.121968 2575 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 20 07:50:03.126464 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.121972 2575 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 20 07:50:03.126464 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.121975 2575 flags.go:64] FLAG: --storage-driver-password="root" Apr 20 07:50:03.126464 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.121978 2575 flags.go:64] FLAG: --storage-driver-secure="false" Apr 20 07:50:03.127136 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.121980 2575 flags.go:64] FLAG: --storage-driver-table="stats" Apr 20 07:50:03.127136 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.121983 2575 flags.go:64] FLAG: --storage-driver-user="root" Apr 20 07:50:03.127136 ip-10-0-129-24 kubenswrapper[2575]: I0420 
07:50:03.121987 2575 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 20 07:50:03.127136 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.121990 2575 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 20 07:50:03.127136 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.121993 2575 flags.go:64] FLAG: --system-cgroups="" Apr 20 07:50:03.127136 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.121996 2575 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 20 07:50:03.127136 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.122002 2575 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 20 07:50:03.127136 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.122005 2575 flags.go:64] FLAG: --tls-cert-file="" Apr 20 07:50:03.127136 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.122008 2575 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 20 07:50:03.127136 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.122013 2575 flags.go:64] FLAG: --tls-min-version="" Apr 20 07:50:03.127136 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.122015 2575 flags.go:64] FLAG: --tls-private-key-file="" Apr 20 07:50:03.127136 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.122018 2575 flags.go:64] FLAG: --topology-manager-policy="none" Apr 20 07:50:03.127136 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.122022 2575 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 20 07:50:03.127136 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.122025 2575 flags.go:64] FLAG: --topology-manager-scope="container" Apr 20 07:50:03.127136 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.122028 2575 flags.go:64] FLAG: --v="2" Apr 20 07:50:03.127136 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.122032 2575 flags.go:64] FLAG: --version="false" Apr 20 07:50:03.127136 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.122036 2575 flags.go:64] FLAG: --vmodule="" Apr 20 07:50:03.127136 ip-10-0-129-24 kubenswrapper[2575]: I0420 
07:50:03.122040 2575 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 20 07:50:03.127136 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.122043 2575 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 20 07:50:03.127136 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.122140 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 20 07:50:03.127136 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.122144 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 20 07:50:03.127136 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.122148 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 20 07:50:03.127136 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.122151 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 20 07:50:03.127136 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.122154 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 20 07:50:03.127836 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.122157 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 20 07:50:03.127836 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.122160 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 20 07:50:03.127836 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.122163 2575 feature_gate.go:328] unrecognized feature gate: Example Apr 20 07:50:03.127836 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.122166 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 20 07:50:03.127836 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.122168 2575 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 20 07:50:03.127836 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.122171 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 20 07:50:03.127836 ip-10-0-129-24 kubenswrapper[2575]: 
W0420 07:50:03.122173 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 20 07:50:03.127836 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.122176 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 20 07:50:03.127836 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.122178 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 20 07:50:03.127836 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.122181 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 20 07:50:03.127836 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.122183 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 20 07:50:03.127836 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.122187 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 20 07:50:03.127836 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.122190 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 20 07:50:03.127836 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.122192 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 20 07:50:03.127836 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.122196 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 20 07:50:03.127836 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.122200 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 20 07:50:03.127836 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.122203 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 20 07:50:03.127836 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.122206 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 20 07:50:03.127836 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.122208 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 20 07:50:03.128359 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.122211 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 20 07:50:03.128359 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.122214 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 20 07:50:03.128359 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.122216 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 20 07:50:03.128359 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.122219 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 20 07:50:03.128359 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.122221 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 20 07:50:03.128359 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.122224 2575 feature_gate.go:328] unrecognized feature gate: Example2 Apr 20 07:50:03.128359 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.122226 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 20 07:50:03.128359 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.122229 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 20 07:50:03.128359 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.122232 2575 feature_gate.go:328] unrecognized feature gate: 
AzureClusterHostedDNSInstall Apr 20 07:50:03.128359 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.122234 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 20 07:50:03.128359 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.122237 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 20 07:50:03.128359 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.122239 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 20 07:50:03.128359 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.122242 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 20 07:50:03.128359 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.122244 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 20 07:50:03.128359 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.122246 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 20 07:50:03.128359 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.122249 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 20 07:50:03.128359 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.122251 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 20 07:50:03.128359 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.122254 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 20 07:50:03.128359 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.122256 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 20 07:50:03.128359 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.122259 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 20 07:50:03.128925 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.122261 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 20 07:50:03.128925 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.122264 2575 
feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 20 07:50:03.128925 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.122266 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 20 07:50:03.128925 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.122272 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 20 07:50:03.128925 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.122275 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 20 07:50:03.128925 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.122279 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 20 07:50:03.128925 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.122281 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 20 07:50:03.128925 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.122287 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 20 07:50:03.128925 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.122290 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 20 07:50:03.128925 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.122292 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 20 07:50:03.128925 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.122295 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 20 07:50:03.128925 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.122297 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 20 07:50:03.128925 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.122300 2575 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 20 07:50:03.128925 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.122303 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 20 07:50:03.128925 
ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.122306 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 20 07:50:03.128925 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.122308 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 20 07:50:03.128925 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.122311 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 20 07:50:03.128925 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.122314 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 20 07:50:03.128925 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.122316 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 20 07:50:03.128925 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.122319 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 20 07:50:03.129427 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.122322 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 20 07:50:03.129427 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.122324 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 20 07:50:03.129427 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.122327 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 20 07:50:03.129427 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.122329 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 20 07:50:03.129427 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.122332 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 20 07:50:03.129427 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.122334 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 20 07:50:03.129427 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.122337 2575 
feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 20 07:50:03.129427 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.122340 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 20 07:50:03.129427 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.122342 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 20 07:50:03.129427 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.122345 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 20 07:50:03.129427 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.122349 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 20 07:50:03.129427 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.122352 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 20 07:50:03.129427 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.122355 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 20 07:50:03.129427 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.122358 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 20 07:50:03.129427 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.122361 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 20 07:50:03.129427 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.122365 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 20 07:50:03.129427 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.122368 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 20 07:50:03.129427 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.122371 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 20 07:50:03.129427 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.122374 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 20 
07:50:03.129912 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.122378 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 20 07:50:03.129912 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.122381 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 20 07:50:03.129912 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.122383 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 20 07:50:03.129912 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.123460 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 20 07:50:03.130191 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.130170 2575 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 20 07:50:03.130224 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.130192 2575 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 20 07:50:03.130254 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130242 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 20 07:50:03.130254 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130247 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 20 07:50:03.130254 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130251 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 20 07:50:03.130254 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130254 2575 feature_gate.go:328] unrecognized feature 
gate: OpenShiftPodSecurityAdmission Apr 20 07:50:03.130358 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130258 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 20 07:50:03.130358 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130261 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 20 07:50:03.130358 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130264 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 20 07:50:03.130358 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130268 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 20 07:50:03.130358 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130270 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 20 07:50:03.130358 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130274 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 20 07:50:03.130358 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130277 2575 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 20 07:50:03.130358 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130279 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 20 07:50:03.130358 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130282 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 20 07:50:03.130358 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130285 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 20 07:50:03.130358 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130287 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 20 07:50:03.130358 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130290 2575 feature_gate.go:328] unrecognized feature gate: Example2 Apr 20 07:50:03.130358 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130293 2575 feature_gate.go:328] 
unrecognized feature gate: NetworkDiagnosticsConfig Apr 20 07:50:03.130358 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130296 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 20 07:50:03.130358 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130298 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 20 07:50:03.130358 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130302 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 20 07:50:03.130358 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130304 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 20 07:50:03.130358 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130307 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 20 07:50:03.130358 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130310 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 20 07:50:03.130358 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130312 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 20 07:50:03.130900 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130315 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 20 07:50:03.130900 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130318 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 20 07:50:03.130900 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130320 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 20 07:50:03.130900 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130323 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 20 07:50:03.130900 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130325 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 20 07:50:03.130900 ip-10-0-129-24 kubenswrapper[2575]: W0420 
07:50:03.130328 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 20 07:50:03.130900 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130331 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 20 07:50:03.130900 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130335 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 20 07:50:03.130900 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130338 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 20 07:50:03.130900 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130340 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 20 07:50:03.130900 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130343 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 20 07:50:03.130900 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130345 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 20 07:50:03.130900 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130348 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 20 07:50:03.130900 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130351 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 20 07:50:03.130900 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130354 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 20 07:50:03.130900 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130357 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 20 07:50:03.130900 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130359 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 20 07:50:03.130900 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130363 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 20 
07:50:03.130900 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130365 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 20 07:50:03.130900 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130368 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 20 07:50:03.131398 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130371 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 20 07:50:03.131398 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130375 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 20 07:50:03.131398 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130379 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 20 07:50:03.131398 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130382 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 20 07:50:03.131398 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130385 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 20 07:50:03.131398 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130388 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 20 07:50:03.131398 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130390 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 20 07:50:03.131398 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130393 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 20 07:50:03.131398 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130396 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 20 07:50:03.131398 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130399 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 20 07:50:03.131398 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130401 2575 
feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 20 07:50:03.131398 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130404 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 20 07:50:03.131398 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130408 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 20 07:50:03.131398 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130412 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 20 07:50:03.131398 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130415 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 20 07:50:03.131398 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130417 2575 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 20 07:50:03.131398 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130420 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 20 07:50:03.131398 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130423 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 20 07:50:03.131398 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130425 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 20 07:50:03.131887 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130428 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 20 07:50:03.131887 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130431 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 20 07:50:03.131887 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130434 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 20 07:50:03.131887 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130436 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 20 07:50:03.131887 ip-10-0-129-24 
kubenswrapper[2575]: W0420 07:50:03.130439 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 20 07:50:03.131887 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130441 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 20 07:50:03.131887 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130444 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 20 07:50:03.131887 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130447 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 20 07:50:03.131887 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130450 2575 feature_gate.go:328] unrecognized feature gate: Example Apr 20 07:50:03.131887 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130453 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 20 07:50:03.131887 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130456 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 20 07:50:03.131887 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130458 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 20 07:50:03.131887 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130461 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 20 07:50:03.131887 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130464 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 20 07:50:03.131887 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130466 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 20 07:50:03.131887 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130469 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 20 07:50:03.131887 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130472 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation 
Apr 20 07:50:03.131887 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130474 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 20 07:50:03.131887 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130477 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 20 07:50:03.131887 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130480 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 20 07:50:03.132380 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130482 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 20 07:50:03.132380 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130485 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 20 07:50:03.132380 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130487 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 20 07:50:03.132380 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.130493 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 20 07:50:03.132380 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130629 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 20 07:50:03.132380 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130635 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 20 07:50:03.132380 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130638 2575 
feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 20 07:50:03.132380 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130641 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 20 07:50:03.132380 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130644 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 20 07:50:03.132380 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130647 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 20 07:50:03.132380 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130651 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 20 07:50:03.132380 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130655 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 20 07:50:03.132380 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130658 2575 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 20 07:50:03.132380 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130660 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 20 07:50:03.132380 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130664 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 20 07:50:03.132764 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130666 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 20 07:50:03.132764 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130669 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 20 07:50:03.132764 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130672 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 20 07:50:03.132764 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130675 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 20 
07:50:03.132764 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130678 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 20 07:50:03.132764 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130681 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 20 07:50:03.132764 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130684 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 20 07:50:03.132764 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130686 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 20 07:50:03.132764 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130689 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 20 07:50:03.132764 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130692 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 20 07:50:03.132764 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130694 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 20 07:50:03.132764 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130697 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 20 07:50:03.132764 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130699 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 20 07:50:03.132764 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130702 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 20 07:50:03.132764 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130704 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 20 07:50:03.132764 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130707 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 20 07:50:03.132764 ip-10-0-129-24 kubenswrapper[2575]: W0420 
07:50:03.130710 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 20 07:50:03.132764 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130713 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 20 07:50:03.132764 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130715 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 20 07:50:03.133245 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130718 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 20 07:50:03.133245 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130721 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 20 07:50:03.133245 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130723 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 20 07:50:03.133245 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130726 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 20 07:50:03.133245 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130728 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 20 07:50:03.133245 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130731 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 20 07:50:03.133245 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130733 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 20 07:50:03.133245 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130736 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 20 07:50:03.133245 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130739 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 20 07:50:03.133245 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130741 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 20 
07:50:03.133245 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130744 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 20 07:50:03.133245 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130746 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 20 07:50:03.133245 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130749 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 20 07:50:03.133245 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130755 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 20 07:50:03.133245 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130757 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 20 07:50:03.133245 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130760 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 20 07:50:03.133245 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130762 2575 feature_gate.go:328] unrecognized feature gate: Example Apr 20 07:50:03.133245 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130765 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 20 07:50:03.133245 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130768 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 20 07:50:03.133245 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130770 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 20 07:50:03.133751 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130773 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 20 07:50:03.133751 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130776 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 20 07:50:03.133751 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130779 2575 feature_gate.go:328] unrecognized feature gate: 
GCPClusterHostedDNS Apr 20 07:50:03.133751 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130782 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 20 07:50:03.133751 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130784 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 20 07:50:03.133751 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130787 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 20 07:50:03.133751 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130789 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 20 07:50:03.133751 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130792 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 20 07:50:03.133751 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130796 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 20 07:50:03.133751 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130799 2575 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 20 07:50:03.133751 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130802 2575 feature_gate.go:328] unrecognized feature gate: Example2 Apr 20 07:50:03.133751 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130805 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 20 07:50:03.133751 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130808 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 20 07:50:03.133751 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130811 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 20 07:50:03.133751 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130814 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 20 07:50:03.133751 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130816 2575 
feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 20 07:50:03.133751 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130819 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 20 07:50:03.133751 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130821 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 20 07:50:03.133751 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130824 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 20 07:50:03.133751 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130826 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 20 07:50:03.134222 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130829 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 20 07:50:03.134222 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130831 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 20 07:50:03.134222 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130834 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 20 07:50:03.134222 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130837 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 20 07:50:03.134222 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130839 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 20 07:50:03.134222 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130841 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 20 07:50:03.134222 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130847 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 20 07:50:03.134222 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130850 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 20 07:50:03.134222 ip-10-0-129-24 kubenswrapper[2575]: 
W0420 07:50:03.130852 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 07:50:03.134222 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130855 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 07:50:03.134222 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130858 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 07:50:03.134222 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130861 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 07:50:03.134222 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130863 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 07:50:03.134222 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130866 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 07:50:03.134222 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130868 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 07:50:03.134222 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:03.130871 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 07:50:03.134635 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.130876 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 20 07:50:03.134635 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.131576 2575 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 20 07:50:03.135599 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.135583 2575 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 20 07:50:03.136447 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.136434 2575 server.go:1019] "Starting client certificate rotation"
Apr 20 07:50:03.136552 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.136534 2575 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 20 07:50:03.136607 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.136589 2575 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 20 07:50:03.158073 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.158038 2575 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 20 07:50:03.161579 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.161456 2575 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 20 07:50:03.175887 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.175861 2575 log.go:25] "Validated CRI v1 runtime API"
Apr 20 07:50:03.181554 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.181535 2575 log.go:25] "Validated CRI v1 image API"
Apr 20 07:50:03.182726 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.182707 2575 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 20 07:50:03.185004 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.184975 2575 fs.go:135] Filesystem UUIDs: map[134ef21b-cf53-4c6b-b2ff-1cc17b9fa920:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 b714ded8-7be8-4698-a659-da4f7af6016b:/dev/nvme0n1p4]
Apr 20 07:50:03.185085 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.185002 2575 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 20 07:50:03.186787 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.186765 2575 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 20 07:50:03.191045 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.190916 2575 manager.go:217] Machine: {Timestamp:2026-04-20 07:50:03.189118263 +0000 UTC m=+0.362445684 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3100038 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2550f8511579c3f08ef4dc2725580f SystemUUID:ec2550f8-5115-79c3-f08e-f4dc2725580f BootID:e17dcb41-dd87-434a-ade2-d991851cba3d Filesystems:[{Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:55:a3:1a:ae:2f Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:55:a3:1a:ae:2f Speed:0 Mtu:9001} {Name:ovs-system MacAddress:72:50:e9:15:1d:50 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 20 07:50:03.191045 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.191039 2575 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 20 07:50:03.191154 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.191133 2575 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 20 07:50:03.193474 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.193446 2575 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 20 07:50:03.193626 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.193476 2575 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-129-24.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 20 07:50:03.193677 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.193635 2575 topology_manager.go:138] "Creating topology manager with none policy"
Apr 20 07:50:03.193677 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.193645 2575 container_manager_linux.go:306] "Creating device plugin manager"
Apr 20 07:50:03.193677 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.193659 2575 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 20 07:50:03.194274 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.194263 2575 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 20 07:50:03.195410 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.195398 2575 state_mem.go:36] "Initialized new in-memory state store"
Apr 20 07:50:03.195525 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.195516 2575 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 20 07:50:03.197854 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.197842 2575 kubelet.go:491] "Attempting to sync node with API server"
Apr 20 07:50:03.197902 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.197858 2575 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 20 07:50:03.197902 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.197873 2575 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 20 07:50:03.197902 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.197883 2575 kubelet.go:397] "Adding apiserver pod source"
Apr 20 07:50:03.197902 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.197901 2575 apiserver.go:42] "Waiting for node sync before watching 
apiserver pods" Apr 20 07:50:03.198910 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.198897 2575 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 20 07:50:03.198955 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.198917 2575 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 20 07:50:03.201714 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.201687 2575 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 20 07:50:03.207280 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.207258 2575 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 20 07:50:03.209676 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.209659 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 20 07:50:03.209676 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.209679 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 20 07:50:03.209800 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.209685 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 20 07:50:03.209800 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.209691 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 20 07:50:03.209800 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.209697 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 20 07:50:03.209800 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.209703 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 20 07:50:03.209800 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.209709 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 20 
07:50:03.209800 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.209714 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 20 07:50:03.209800 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.209723 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 20 07:50:03.209800 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.209729 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 20 07:50:03.209800 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.209737 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 20 07:50:03.209800 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.209746 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 20 07:50:03.210913 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.210900 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 20 07:50:03.210913 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.210914 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 20 07:50:03.211924 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:03.211899 2575 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 20 07:50:03.212001 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:03.211932 2575 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-129-24.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 20 07:50:03.214700 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.214686 2575 watchdog_linux.go:99] 
"Systemd watchdog is not enabled" Apr 20 07:50:03.214754 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.214725 2575 server.go:1295] "Started kubelet" Apr 20 07:50:03.214848 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.214805 2575 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 20 07:50:03.214848 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.214828 2575 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 20 07:50:03.215009 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.214862 2575 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 20 07:50:03.215657 ip-10-0-129-24 systemd[1]: Started Kubernetes Kubelet. Apr 20 07:50:03.215883 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.215864 2575 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 20 07:50:03.217400 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.217379 2575 server.go:317] "Adding debug handlers to kubelet server" Apr 20 07:50:03.219412 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.219394 2575 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-129-24.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 20 07:50:03.219989 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:03.219244 2575 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-129-24.ec2.internal.18a8013a5f34ffca default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-129-24.ec2.internal,UID:ip-10-0-129-24.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-129-24.ec2.internal,},FirstTimestamp:2026-04-20 07:50:03.21470049 +0000 UTC m=+0.388027911,LastTimestamp:2026-04-20 07:50:03.21470049 +0000 UTC m=+0.388027911,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-129-24.ec2.internal,}" Apr 20 07:50:03.221258 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.221239 2575 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 20 07:50:03.221714 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.221690 2575 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 20 07:50:03.222233 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.222203 2575 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 20 07:50:03.222331 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.222248 2575 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 20 07:50:03.222331 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.222300 2575 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 20 07:50:03.222331 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:03.222278 2575 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 20 07:50:03.222484 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.222384 2575 reconstruct.go:97] "Volume reconstruction finished" Apr 20 07:50:03.222484 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.222391 2575 reconciler.go:26] "Reconciler: start to sync state" Apr 20 07:50:03.222484 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:03.222465 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-24.ec2.internal\" not found" Apr 20 07:50:03.222693 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.222678 2575 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 20 07:50:03.222693 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.222694 2575 factory.go:55] Registering systemd factory Apr 20 07:50:03.222794 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.222704 2575 factory.go:223] Registration of the systemd container factory successfully Apr 20 07:50:03.222904 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.222892 2575 factory.go:153] Registering CRI-O factory Apr 20 07:50:03.222956 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.222907 2575 factory.go:223] Registration of the crio container factory successfully Apr 20 07:50:03.222956 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.222921 2575 factory.go:103] Registering Raw factory Apr 20 07:50:03.222956 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.222932 2575 manager.go:1196] Started watching for new ooms in manager Apr 20 07:50:03.223338 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.223326 2575 manager.go:319] Starting recovery of all containers Apr 20 07:50:03.223738 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.223718 
2575 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-thrkt" Apr 20 07:50:03.231646 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:03.231410 2575 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-129-24.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 20 07:50:03.231646 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.231459 2575 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-thrkt" Apr 20 07:50:03.231646 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:03.231634 2575 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 20 07:50:03.237515 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.237496 2575 manager.go:324] Recovery completed Apr 20 07:50:03.241802 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.241789 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 07:50:03.244310 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.244294 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-24.ec2.internal" event="NodeHasSufficientMemory" Apr 20 07:50:03.244385 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.244322 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-24.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 07:50:03.244385 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.244334 2575 kubelet_node_status.go:736] 
"Recording event message for node" node="ip-10-0-129-24.ec2.internal" event="NodeHasSufficientPID" Apr 20 07:50:03.244905 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.244891 2575 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 20 07:50:03.244905 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.244903 2575 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 20 07:50:03.245021 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.244918 2575 state_mem.go:36] "Initialized new in-memory state store" Apr 20 07:50:03.246649 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:03.246555 2575 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-129-24.ec2.internal.18a8013a60f8c780 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-129-24.ec2.internal,UID:ip-10-0-129-24.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-129-24.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-129-24.ec2.internal,},FirstTimestamp:2026-04-20 07:50:03.244308352 +0000 UTC m=+0.417635772,LastTimestamp:2026-04-20 07:50:03.244308352 +0000 UTC m=+0.417635772,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-129-24.ec2.internal,}" Apr 20 07:50:03.248028 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.248015 2575 policy_none.go:49] "None policy: Start" Apr 20 07:50:03.248076 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.248042 2575 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 20 07:50:03.248076 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.248053 2575 state_mem.go:35] "Initializing new in-memory state 
store" Apr 20 07:50:03.301461 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.282332 2575 manager.go:341] "Starting Device Plugin manager" Apr 20 07:50:03.301461 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:03.282374 2575 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 20 07:50:03.301461 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.282388 2575 server.go:85] "Starting device plugin registration server" Apr 20 07:50:03.301461 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.282701 2575 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 20 07:50:03.301461 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.282716 2575 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 20 07:50:03.301461 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.282816 2575 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 20 07:50:03.301461 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.282960 2575 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 20 07:50:03.301461 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.282968 2575 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 20 07:50:03.301461 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:03.283538 2575 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 20 07:50:03.301461 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:03.283568 2575 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-129-24.ec2.internal\" not found" Apr 20 07:50:03.301461 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.284342 2575 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv4" Apr 20 07:50:03.301461 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.285484 2575 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 20 07:50:03.301461 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.285506 2575 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 20 07:50:03.301461 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.285523 2575 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 20 07:50:03.301461 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.285537 2575 kubelet.go:2451] "Starting kubelet main sync loop" Apr 20 07:50:03.301461 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:03.285567 2575 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 20 07:50:03.301461 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.288105 2575 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 07:50:03.383810 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.383787 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 07:50:03.385767 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.385738 2575 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-24.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-129-24.ec2.internal"] Apr 20 07:50:03.385881 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.385814 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 07:50:03.386088 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.386069 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-24.ec2.internal" event="NodeHasSufficientMemory" Apr 20 07:50:03.386168 ip-10-0-129-24 kubenswrapper[2575]: 
I0420 07:50:03.386105 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-24.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 07:50:03.386168 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.386123 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-24.ec2.internal" event="NodeHasSufficientPID" Apr 20 07:50:03.386168 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.386151 2575 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-129-24.ec2.internal" Apr 20 07:50:03.387340 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.387324 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-24.ec2.internal" event="NodeHasSufficientMemory" Apr 20 07:50:03.387408 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.387352 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-24.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 07:50:03.387408 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.387362 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-24.ec2.internal" event="NodeHasSufficientPID" Apr 20 07:50:03.389603 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.389589 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 07:50:03.389738 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.389724 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-24.ec2.internal" Apr 20 07:50:03.389779 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.389754 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 07:50:03.390414 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.390398 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-24.ec2.internal" event="NodeHasSufficientMemory" Apr 20 07:50:03.390470 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.390424 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-24.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 07:50:03.390470 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.390435 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-24.ec2.internal" event="NodeHasSufficientPID" Apr 20 07:50:03.390559 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.390402 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-24.ec2.internal" event="NodeHasSufficientMemory" Apr 20 07:50:03.390559 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.390510 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-24.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 07:50:03.390559 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.390528 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-24.ec2.internal" event="NodeHasSufficientPID" Apr 20 07:50:03.392584 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.392568 2575 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-129-24.ec2.internal" Apr 20 07:50:03.392690 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:03.392592 2575 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-129-24.ec2.internal\": node 
\"ip-10-0-129-24.ec2.internal\" not found" Apr 20 07:50:03.392690 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.392664 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-24.ec2.internal" Apr 20 07:50:03.392690 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.392689 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 07:50:03.393419 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.393405 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-24.ec2.internal" event="NodeHasSufficientMemory" Apr 20 07:50:03.393477 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.393430 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-24.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 07:50:03.393477 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.393442 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-24.ec2.internal" event="NodeHasSufficientPID" Apr 20 07:50:03.405391 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:03.405370 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-24.ec2.internal\" not found" Apr 20 07:50:03.406233 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:03.406213 2575 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-129-24.ec2.internal\" not found" node="ip-10-0-129-24.ec2.internal" Apr 20 07:50:03.409535 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:03.409518 2575 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-129-24.ec2.internal\" not found" node="ip-10-0-129-24.ec2.internal" Apr 20 07:50:03.423113 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.423087 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/5018c020cf992df6097e0aee42f16bf3-config\") pod \"kube-apiserver-proxy-ip-10-0-129-24.ec2.internal\" (UID: \"5018c020cf992df6097e0aee42f16bf3\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-24.ec2.internal" Apr 20 07:50:03.423198 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.423115 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/2a8554c547a781f4277e2efb4f0aafc4-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-24.ec2.internal\" (UID: \"2a8554c547a781f4277e2efb4f0aafc4\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-24.ec2.internal" Apr 20 07:50:03.423198 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.423133 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2a8554c547a781f4277e2efb4f0aafc4-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-24.ec2.internal\" (UID: \"2a8554c547a781f4277e2efb4f0aafc4\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-24.ec2.internal" Apr 20 07:50:03.506387 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:03.506355 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-24.ec2.internal\" not found" Apr 20 07:50:03.523699 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.523669 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/2a8554c547a781f4277e2efb4f0aafc4-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-24.ec2.internal\" (UID: \"2a8554c547a781f4277e2efb4f0aafc4\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-24.ec2.internal" Apr 20 07:50:03.523820 ip-10-0-129-24 
kubenswrapper[2575]: I0420 07:50:03.523708 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2a8554c547a781f4277e2efb4f0aafc4-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-24.ec2.internal\" (UID: \"2a8554c547a781f4277e2efb4f0aafc4\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-24.ec2.internal" Apr 20 07:50:03.523820 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.523732 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/5018c020cf992df6097e0aee42f16bf3-config\") pod \"kube-apiserver-proxy-ip-10-0-129-24.ec2.internal\" (UID: \"5018c020cf992df6097e0aee42f16bf3\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-24.ec2.internal" Apr 20 07:50:03.523820 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.523765 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/5018c020cf992df6097e0aee42f16bf3-config\") pod \"kube-apiserver-proxy-ip-10-0-129-24.ec2.internal\" (UID: \"5018c020cf992df6097e0aee42f16bf3\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-24.ec2.internal" Apr 20 07:50:03.523820 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.523773 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2a8554c547a781f4277e2efb4f0aafc4-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-24.ec2.internal\" (UID: \"2a8554c547a781f4277e2efb4f0aafc4\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-24.ec2.internal" Apr 20 07:50:03.523820 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.523773 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/2a8554c547a781f4277e2efb4f0aafc4-etc-kube\") pod 
\"kube-rbac-proxy-crio-ip-10-0-129-24.ec2.internal\" (UID: \"2a8554c547a781f4277e2efb4f0aafc4\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-24.ec2.internal" Apr 20 07:50:03.606826 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:03.606785 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-24.ec2.internal\" not found" Apr 20 07:50:03.707573 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:03.707494 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-24.ec2.internal\" not found" Apr 20 07:50:03.707573 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.707527 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-24.ec2.internal" Apr 20 07:50:03.712059 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:03.712031 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-24.ec2.internal" Apr 20 07:50:03.807891 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:03.807848 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-24.ec2.internal\" not found" Apr 20 07:50:03.908469 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:03.908419 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-24.ec2.internal\" not found" Apr 20 07:50:04.009012 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:04.008927 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-24.ec2.internal\" not found" Apr 20 07:50:04.109559 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:04.109519 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-24.ec2.internal\" not found" Apr 20 07:50:04.135962 ip-10-0-129-24 kubenswrapper[2575]: I0420 
07:50:04.135942 2575 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 20 07:50:04.136590 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:04.136078 2575 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 20 07:50:04.210568 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:04.210537 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-24.ec2.internal\" not found" Apr 20 07:50:04.221423 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:04.221397 2575 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 20 07:50:04.233776 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:04.233728 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-19 07:45:03 +0000 UTC" deadline="2027-10-21 14:48:57.05651784 +0000 UTC" Apr 20 07:50:04.233776 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:04.233774 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13182h58m52.822748702s" Apr 20 07:50:04.238575 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:04.238554 2575 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 20 07:50:04.258871 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:04.258846 2575 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-x2thr" Apr 20 07:50:04.267046 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:04.266970 2575 
csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-x2thr" Apr 20 07:50:04.310691 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:04.310660 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-24.ec2.internal\" not found" Apr 20 07:50:04.376360 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:04.376331 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5018c020cf992df6097e0aee42f16bf3.slice/crio-c5e310557dc89151ca48e89c67c3ad1f80fa6b026f9f17623a87ba6a24fd7990 WatchSource:0}: Error finding container c5e310557dc89151ca48e89c67c3ad1f80fa6b026f9f17623a87ba6a24fd7990: Status 404 returned error can't find the container with id c5e310557dc89151ca48e89c67c3ad1f80fa6b026f9f17623a87ba6a24fd7990 Apr 20 07:50:04.376576 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:04.376565 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2a8554c547a781f4277e2efb4f0aafc4.slice/crio-b4e6e70722cb3e255551e8d102c2dd634c594912eeb929020f74c7e2a9688549 WatchSource:0}: Error finding container b4e6e70722cb3e255551e8d102c2dd634c594912eeb929020f74c7e2a9688549: Status 404 returned error can't find the container with id b4e6e70722cb3e255551e8d102c2dd634c594912eeb929020f74c7e2a9688549 Apr 20 07:50:04.380345 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:04.380320 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 07:50:04.411807 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:04.411761 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-24.ec2.internal\" not found" Apr 20 07:50:04.468971 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:04.468941 2575 reflector.go:430] "Caches populated" type="*v1.Service" 
reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 07:50:04.469957 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:04.469935 2575 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 07:50:04.522027 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:04.521941 2575 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-24.ec2.internal" Apr 20 07:50:04.535119 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:04.535089 2575 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 20 07:50:04.536048 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:04.536032 2575 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-24.ec2.internal" Apr 20 07:50:04.544200 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:04.544180 2575 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 20 07:50:04.744739 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:04.744709 2575 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 07:50:05.199215 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.199173 2575 apiserver.go:52] "Watching apiserver" Apr 20 07:50:05.207866 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.207227 2575 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 20 07:50:05.207866 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.207783 2575 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-network-operator/iptables-alerter-459v4","openshift-cluster-node-tuning-operator/tuned-gt9w2","openshift-dns/node-resolver-z54dw","openshift-image-registry/node-ca-n9944","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-24.ec2.internal","openshift-multus/multus-additional-cni-plugins-hzzk5","openshift-multus/network-metrics-daemon-7qpdh","openshift-ovn-kubernetes/ovnkube-node-crn8d","kube-system/konnectivity-agent-ggtd4","kube-system/kube-apiserver-proxy-ip-10-0-129-24.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-24jx4","openshift-multus/multus-7r2r4","openshift-network-diagnostics/network-check-target-sfl6t"] Apr 20 07:50:05.211451 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.211335 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7qpdh" Apr 20 07:50:05.211451 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:05.211416 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7qpdh" podUID="46b8deca-b33b-45e1-9131-b47fde192a78" Apr 20 07:50:05.215699 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.215672 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-gt9w2" Apr 20 07:50:05.218172 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.218136 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 20 07:50:05.218351 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.218332 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-ssw4t\"" Apr 20 07:50:05.218585 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.218569 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 20 07:50:05.218863 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.218834 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-z54dw" Apr 20 07:50:05.220974 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.220944 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 20 07:50:05.221693 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.221271 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-n9944" Apr 20 07:50:05.221693 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.221328 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-m258s\"" Apr 20 07:50:05.221693 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.221400 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 20 07:50:05.223493 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.223470 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 20 07:50:05.223493 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.223491 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 20 07:50:05.223726 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.223523 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-hzzk5" Apr 20 07:50:05.223726 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.223655 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-459v4" Apr 20 07:50:05.224012 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.223992 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 20 07:50:05.224115 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.224093 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-5fsx7\"" Apr 20 07:50:05.227037 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.227017 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 20 07:50:05.227156 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.227068 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 20 07:50:05.227156 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.227086 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-tx4bb\"" Apr 20 07:50:05.227156 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.227019 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 20 07:50:05.227308 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.227196 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 20 07:50:05.227308 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.227108 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-4grwr\"" Apr 20 07:50:05.227403 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.227364 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 20 07:50:05.227838 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.227774 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 20 07:50:05.227838 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.227815 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 20 07:50:05.227987 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.227818 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 20 07:50:05.228509 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.228494 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-crn8d" Apr 20 07:50:05.230806 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.230782 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 20 07:50:05.231440 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.231231 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-sj9vt\"" Apr 20 07:50:05.231440 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.231324 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-ggtd4" Apr 20 07:50:05.231440 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.231405 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 20 07:50:05.232873 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.232518 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 20 07:50:05.233803 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.233778 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 20 07:50:05.233893 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.233872 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 20 07:50:05.234360 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.234340 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 20 07:50:05.234446 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.234397 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 20 07:50:05.234663 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.234644 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/c1bad8b1-05dc-4cf2-a2b4-6354840ca7a0-etc-sysctl-d\") pod \"tuned-gt9w2\" (UID: \"c1bad8b1-05dc-4cf2-a2b4-6354840ca7a0\") " pod="openshift-cluster-node-tuning-operator/tuned-gt9w2" Apr 20 07:50:05.234752 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.234688 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/c1bad8b1-05dc-4cf2-a2b4-6354840ca7a0-lib-modules\") pod \"tuned-gt9w2\" (UID: \"c1bad8b1-05dc-4cf2-a2b4-6354840ca7a0\") " pod="openshift-cluster-node-tuning-operator/tuned-gt9w2" Apr 20 07:50:05.234752 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.234725 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbfr6\" (UniqueName: \"kubernetes.io/projected/14afe6ef-f671-4736-ba3f-ac6236c30291-kube-api-access-hbfr6\") pod \"node-resolver-z54dw\" (UID: \"14afe6ef-f671-4736-ba3f-ac6236c30291\") " pod="openshift-dns/node-resolver-z54dw" Apr 20 07:50:05.234852 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.234758 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b7ed9bc9-8f0a-40e4-bd11-25b4f1c3cd39-host\") pod \"node-ca-n9944\" (UID: \"b7ed9bc9-8f0a-40e4-bd11-25b4f1c3cd39\") " pod="openshift-image-registry/node-ca-n9944" Apr 20 07:50:05.234852 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.234794 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/ba7d0588-c3eb-4849-ae3b-630be7fcc621-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-hzzk5\" (UID: \"ba7d0588-c3eb-4849-ae3b-630be7fcc621\") " pod="openshift-multus/multus-additional-cni-plugins-hzzk5" Apr 20 07:50:05.234852 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.234827 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/c1bad8b1-05dc-4cf2-a2b4-6354840ca7a0-etc-systemd\") pod \"tuned-gt9w2\" (UID: \"c1bad8b1-05dc-4cf2-a2b4-6354840ca7a0\") " pod="openshift-cluster-node-tuning-operator/tuned-gt9w2" Apr 20 07:50:05.234996 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.234857 
2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c1bad8b1-05dc-4cf2-a2b4-6354840ca7a0-var-lib-kubelet\") pod \"tuned-gt9w2\" (UID: \"c1bad8b1-05dc-4cf2-a2b4-6354840ca7a0\") " pod="openshift-cluster-node-tuning-operator/tuned-gt9w2" Apr 20 07:50:05.234996 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.234887 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24kmc\" (UniqueName: \"kubernetes.io/projected/c1bad8b1-05dc-4cf2-a2b4-6354840ca7a0-kube-api-access-24kmc\") pod \"tuned-gt9w2\" (UID: \"c1bad8b1-05dc-4cf2-a2b4-6354840ca7a0\") " pod="openshift-cluster-node-tuning-operator/tuned-gt9w2" Apr 20 07:50:05.234996 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.234933 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f8898642-a099-48c6-ba9f-0a5099d78d5c-host-slash\") pod \"iptables-alerter-459v4\" (UID: \"f8898642-a099-48c6-ba9f-0a5099d78d5c\") " pod="openshift-network-operator/iptables-alerter-459v4" Apr 20 07:50:05.234996 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.234963 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c1bad8b1-05dc-4cf2-a2b4-6354840ca7a0-run\") pod \"tuned-gt9w2\" (UID: \"c1bad8b1-05dc-4cf2-a2b4-6354840ca7a0\") " pod="openshift-cluster-node-tuning-operator/tuned-gt9w2" Apr 20 07:50:05.235177 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.234995 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ba7d0588-c3eb-4849-ae3b-630be7fcc621-os-release\") pod \"multus-additional-cni-plugins-hzzk5\" (UID: \"ba7d0588-c3eb-4849-ae3b-630be7fcc621\") " 
pod="openshift-multus/multus-additional-cni-plugins-hzzk5" Apr 20 07:50:05.235177 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.235036 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/ba7d0588-c3eb-4849-ae3b-630be7fcc621-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-hzzk5\" (UID: \"ba7d0588-c3eb-4849-ae3b-630be7fcc621\") " pod="openshift-multus/multus-additional-cni-plugins-hzzk5" Apr 20 07:50:05.235177 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.235066 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/b7ed9bc9-8f0a-40e4-bd11-25b4f1c3cd39-serviceca\") pod \"node-ca-n9944\" (UID: \"b7ed9bc9-8f0a-40e4-bd11-25b4f1c3cd39\") " pod="openshift-image-registry/node-ca-n9944" Apr 20 07:50:05.235177 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.235097 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76nct\" (UniqueName: \"kubernetes.io/projected/b7ed9bc9-8f0a-40e4-bd11-25b4f1c3cd39-kube-api-access-76nct\") pod \"node-ca-n9944\" (UID: \"b7ed9bc9-8f0a-40e4-bd11-25b4f1c3cd39\") " pod="openshift-image-registry/node-ca-n9944" Apr 20 07:50:05.235177 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.235128 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brwqt\" (UniqueName: \"kubernetes.io/projected/f8898642-a099-48c6-ba9f-0a5099d78d5c-kube-api-access-brwqt\") pod \"iptables-alerter-459v4\" (UID: \"f8898642-a099-48c6-ba9f-0a5099d78d5c\") " pod="openshift-network-operator/iptables-alerter-459v4" Apr 20 07:50:05.235177 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.235164 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" 
(UniqueName: \"kubernetes.io/host-path/c1bad8b1-05dc-4cf2-a2b4-6354840ca7a0-etc-modprobe-d\") pod \"tuned-gt9w2\" (UID: \"c1bad8b1-05dc-4cf2-a2b4-6354840ca7a0\") " pod="openshift-cluster-node-tuning-operator/tuned-gt9w2" Apr 20 07:50:05.235546 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.235197 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/c1bad8b1-05dc-4cf2-a2b4-6354840ca7a0-etc-sysctl-conf\") pod \"tuned-gt9w2\" (UID: \"c1bad8b1-05dc-4cf2-a2b4-6354840ca7a0\") " pod="openshift-cluster-node-tuning-operator/tuned-gt9w2" Apr 20 07:50:05.235546 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.235232 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ba7d0588-c3eb-4849-ae3b-630be7fcc621-cnibin\") pod \"multus-additional-cni-plugins-hzzk5\" (UID: \"ba7d0588-c3eb-4849-ae3b-630be7fcc621\") " pod="openshift-multus/multus-additional-cni-plugins-hzzk5" Apr 20 07:50:05.235546 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.235299 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ba7d0588-c3eb-4849-ae3b-630be7fcc621-cni-binary-copy\") pod \"multus-additional-cni-plugins-hzzk5\" (UID: \"ba7d0588-c3eb-4849-ae3b-630be7fcc621\") " pod="openshift-multus/multus-additional-cni-plugins-hzzk5" Apr 20 07:50:05.235546 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.235335 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ba7d0588-c3eb-4849-ae3b-630be7fcc621-tuning-conf-dir\") pod \"multus-additional-cni-plugins-hzzk5\" (UID: \"ba7d0588-c3eb-4849-ae3b-630be7fcc621\") " pod="openshift-multus/multus-additional-cni-plugins-hzzk5" Apr 20 07:50:05.235546 
ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.235368 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/f8898642-a099-48c6-ba9f-0a5099d78d5c-iptables-alerter-script\") pod \"iptables-alerter-459v4\" (UID: \"f8898642-a099-48c6-ba9f-0a5099d78d5c\") " pod="openshift-network-operator/iptables-alerter-459v4" Apr 20 07:50:05.235546 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.235400 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nckkc\" (UniqueName: \"kubernetes.io/projected/46b8deca-b33b-45e1-9131-b47fde192a78-kube-api-access-nckkc\") pod \"network-metrics-daemon-7qpdh\" (UID: \"46b8deca-b33b-45e1-9131-b47fde192a78\") " pod="openshift-multus/network-metrics-daemon-7qpdh" Apr 20 07:50:05.235546 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.235445 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/c1bad8b1-05dc-4cf2-a2b4-6354840ca7a0-etc-sysconfig\") pod \"tuned-gt9w2\" (UID: \"c1bad8b1-05dc-4cf2-a2b4-6354840ca7a0\") " pod="openshift-cluster-node-tuning-operator/tuned-gt9w2" Apr 20 07:50:05.235546 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.235480 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/c1bad8b1-05dc-4cf2-a2b4-6354840ca7a0-etc-tuned\") pod \"tuned-gt9w2\" (UID: \"c1bad8b1-05dc-4cf2-a2b4-6354840ca7a0\") " pod="openshift-cluster-node-tuning-operator/tuned-gt9w2" Apr 20 07:50:05.236022 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.235550 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c1bad8b1-05dc-4cf2-a2b4-6354840ca7a0-tmp\") pod \"tuned-gt9w2\" 
(UID: \"c1bad8b1-05dc-4cf2-a2b4-6354840ca7a0\") " pod="openshift-cluster-node-tuning-operator/tuned-gt9w2" Apr 20 07:50:05.236022 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.235596 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/14afe6ef-f671-4736-ba3f-ac6236c30291-hosts-file\") pod \"node-resolver-z54dw\" (UID: \"14afe6ef-f671-4736-ba3f-ac6236c30291\") " pod="openshift-dns/node-resolver-z54dw" Apr 20 07:50:05.236022 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.235694 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c1bad8b1-05dc-4cf2-a2b4-6354840ca7a0-etc-kubernetes\") pod \"tuned-gt9w2\" (UID: \"c1bad8b1-05dc-4cf2-a2b4-6354840ca7a0\") " pod="openshift-cluster-node-tuning-operator/tuned-gt9w2" Apr 20 07:50:05.236022 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.235805 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/14afe6ef-f671-4736-ba3f-ac6236c30291-tmp-dir\") pod \"node-resolver-z54dw\" (UID: \"14afe6ef-f671-4736-ba3f-ac6236c30291\") " pod="openshift-dns/node-resolver-z54dw" Apr 20 07:50:05.236022 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.235952 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5gt6b\" (UniqueName: \"kubernetes.io/projected/ba7d0588-c3eb-4849-ae3b-630be7fcc621-kube-api-access-5gt6b\") pod \"multus-additional-cni-plugins-hzzk5\" (UID: \"ba7d0588-c3eb-4849-ae3b-630be7fcc621\") " pod="openshift-multus/multus-additional-cni-plugins-hzzk5" Apr 20 07:50:05.236022 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.235987 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" 
(UniqueName: \"kubernetes.io/secret/46b8deca-b33b-45e1-9131-b47fde192a78-metrics-certs\") pod \"network-metrics-daemon-7qpdh\" (UID: \"46b8deca-b33b-45e1-9131-b47fde192a78\") " pod="openshift-multus/network-metrics-daemon-7qpdh" Apr 20 07:50:05.236022 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.236013 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 20 07:50:05.236339 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.236032 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c1bad8b1-05dc-4cf2-a2b4-6354840ca7a0-sys\") pod \"tuned-gt9w2\" (UID: \"c1bad8b1-05dc-4cf2-a2b4-6354840ca7a0\") " pod="openshift-cluster-node-tuning-operator/tuned-gt9w2" Apr 20 07:50:05.236339 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.236071 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c1bad8b1-05dc-4cf2-a2b4-6354840ca7a0-host\") pod \"tuned-gt9w2\" (UID: \"c1bad8b1-05dc-4cf2-a2b4-6354840ca7a0\") " pod="openshift-cluster-node-tuning-operator/tuned-gt9w2" Apr 20 07:50:05.236339 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.236115 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ba7d0588-c3eb-4849-ae3b-630be7fcc621-system-cni-dir\") pod \"multus-additional-cni-plugins-hzzk5\" (UID: \"ba7d0588-c3eb-4849-ae3b-630be7fcc621\") " pod="openshift-multus/multus-additional-cni-plugins-hzzk5" Apr 20 07:50:05.236339 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.236184 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-j887h\"" Apr 20 07:50:05.236523 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.236455 2575 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-24jx4" Apr 20 07:50:05.236849 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.236813 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-7r2r4" Apr 20 07:50:05.238731 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.238712 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-nhvx4\"" Apr 20 07:50:05.238887 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.238866 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 20 07:50:05.239525 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.239503 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sfl6t" Apr 20 07:50:05.239601 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:05.239572 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-sfl6t" podUID="ef4d76cc-35ef-46b3-83b9-ee5fe3bb7a06" Apr 20 07:50:05.239673 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.239642 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 20 07:50:05.239913 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.239890 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 20 07:50:05.240019 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.239955 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 20 07:50:05.240097 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.240075 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-5sfpv\"" Apr 20 07:50:05.268638 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.268029 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-19 07:45:04 +0000 UTC" deadline="2028-01-13 13:53:06.221905558 +0000 UTC" Apr 20 07:50:05.268638 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.268069 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15198h3m0.953840724s" Apr 20 07:50:05.290933 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.290794 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-24.ec2.internal" event={"ID":"5018c020cf992df6097e0aee42f16bf3","Type":"ContainerStarted","Data":"c5e310557dc89151ca48e89c67c3ad1f80fa6b026f9f17623a87ba6a24fd7990"} Apr 20 07:50:05.292186 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.292156 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-24.ec2.internal" event={"ID":"2a8554c547a781f4277e2efb4f0aafc4","Type":"ContainerStarted","Data":"b4e6e70722cb3e255551e8d102c2dd634c594912eeb929020f74c7e2a9688549"} Apr 20 07:50:05.323949 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.323919 2575 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 20 07:50:05.336920 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.336887 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5gt6b\" (UniqueName: \"kubernetes.io/projected/ba7d0588-c3eb-4849-ae3b-630be7fcc621-kube-api-access-5gt6b\") pod \"multus-additional-cni-plugins-hzzk5\" (UID: \"ba7d0588-c3eb-4849-ae3b-630be7fcc621\") " pod="openshift-multus/multus-additional-cni-plugins-hzzk5" Apr 20 07:50:05.337095 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.336936 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f3a24a42-45b7-4726-bc7c-32d9f9d61eaf-host-cni-netd\") pod \"ovnkube-node-crn8d\" (UID: \"f3a24a42-45b7-4726-bc7c-32d9f9d61eaf\") " pod="openshift-ovn-kubernetes/ovnkube-node-crn8d" Apr 20 07:50:05.337095 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.336968 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f3a24a42-45b7-4726-bc7c-32d9f9d61eaf-env-overrides\") pod \"ovnkube-node-crn8d\" (UID: \"f3a24a42-45b7-4726-bc7c-32d9f9d61eaf\") " pod="openshift-ovn-kubernetes/ovnkube-node-crn8d" Apr 20 07:50:05.337095 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.336991 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/04eada51-b824-41d7-8f99-553412d17053-system-cni-dir\") pod 
\"multus-7r2r4\" (UID: \"04eada51-b824-41d7-8f99-553412d17053\") " pod="openshift-multus/multus-7r2r4" Apr 20 07:50:05.337095 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.337034 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/04eada51-b824-41d7-8f99-553412d17053-etc-kubernetes\") pod \"multus-7r2r4\" (UID: \"04eada51-b824-41d7-8f99-553412d17053\") " pod="openshift-multus/multus-7r2r4" Apr 20 07:50:05.337095 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.337061 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/9d731404-e3dc-45b5-a6c6-4d250a44a923-registration-dir\") pod \"aws-ebs-csi-driver-node-24jx4\" (UID: \"9d731404-e3dc-45b5-a6c6-4d250a44a923\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-24jx4" Apr 20 07:50:05.337095 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.337086 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/9d731404-e3dc-45b5-a6c6-4d250a44a923-sys-fs\") pod \"aws-ebs-csi-driver-node-24jx4\" (UID: \"9d731404-e3dc-45b5-a6c6-4d250a44a923\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-24jx4" Apr 20 07:50:05.337375 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.337115 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c1bad8b1-05dc-4cf2-a2b4-6354840ca7a0-sys\") pod \"tuned-gt9w2\" (UID: \"c1bad8b1-05dc-4cf2-a2b4-6354840ca7a0\") " pod="openshift-cluster-node-tuning-operator/tuned-gt9w2" Apr 20 07:50:05.337375 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.337140 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/ba7d0588-c3eb-4849-ae3b-630be7fcc621-system-cni-dir\") pod \"multus-additional-cni-plugins-hzzk5\" (UID: \"ba7d0588-c3eb-4849-ae3b-630be7fcc621\") " pod="openshift-multus/multus-additional-cni-plugins-hzzk5" Apr 20 07:50:05.337375 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.337184 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f3a24a42-45b7-4726-bc7c-32d9f9d61eaf-var-lib-openvswitch\") pod \"ovnkube-node-crn8d\" (UID: \"f3a24a42-45b7-4726-bc7c-32d9f9d61eaf\") " pod="openshift-ovn-kubernetes/ovnkube-node-crn8d" Apr 20 07:50:05.337375 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.337209 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f3a24a42-45b7-4726-bc7c-32d9f9d61eaf-host-cni-bin\") pod \"ovnkube-node-crn8d\" (UID: \"f3a24a42-45b7-4726-bc7c-32d9f9d61eaf\") " pod="openshift-ovn-kubernetes/ovnkube-node-crn8d" Apr 20 07:50:05.337375 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.337232 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/1755049c-9fb4-42bf-8134-d41e0e7a4e97-konnectivity-ca\") pod \"konnectivity-agent-ggtd4\" (UID: \"1755049c-9fb4-42bf-8134-d41e0e7a4e97\") " pod="kube-system/konnectivity-agent-ggtd4" Apr 20 07:50:05.337375 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.337251 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c1bad8b1-05dc-4cf2-a2b4-6354840ca7a0-sys\") pod \"tuned-gt9w2\" (UID: \"c1bad8b1-05dc-4cf2-a2b4-6354840ca7a0\") " pod="openshift-cluster-node-tuning-operator/tuned-gt9w2" Apr 20 07:50:05.337375 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.337272 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/9d731404-e3dc-45b5-a6c6-4d250a44a923-socket-dir\") pod \"aws-ebs-csi-driver-node-24jx4\" (UID: \"9d731404-e3dc-45b5-a6c6-4d250a44a923\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-24jx4" Apr 20 07:50:05.337375 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.337317 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/c1bad8b1-05dc-4cf2-a2b4-6354840ca7a0-etc-sysctl-d\") pod \"tuned-gt9w2\" (UID: \"c1bad8b1-05dc-4cf2-a2b4-6354840ca7a0\") " pod="openshift-cluster-node-tuning-operator/tuned-gt9w2" Apr 20 07:50:05.337375 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.337257 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ba7d0588-c3eb-4849-ae3b-630be7fcc621-system-cni-dir\") pod \"multus-additional-cni-plugins-hzzk5\" (UID: \"ba7d0588-c3eb-4849-ae3b-630be7fcc621\") " pod="openshift-multus/multus-additional-cni-plugins-hzzk5" Apr 20 07:50:05.337375 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.337355 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/ba7d0588-c3eb-4849-ae3b-630be7fcc621-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-hzzk5\" (UID: \"ba7d0588-c3eb-4849-ae3b-630be7fcc621\") " pod="openshift-multus/multus-additional-cni-plugins-hzzk5" Apr 20 07:50:05.337868 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.337396 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f3a24a42-45b7-4726-bc7c-32d9f9d61eaf-etc-openvswitch\") pod \"ovnkube-node-crn8d\" (UID: \"f3a24a42-45b7-4726-bc7c-32d9f9d61eaf\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-crn8d" Apr 20 07:50:05.337868 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.337424 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/04eada51-b824-41d7-8f99-553412d17053-host-run-k8s-cni-cncf-io\") pod \"multus-7r2r4\" (UID: \"04eada51-b824-41d7-8f99-553412d17053\") " pod="openshift-multus/multus-7r2r4" Apr 20 07:50:05.337868 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.337448 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9d731404-e3dc-45b5-a6c6-4d250a44a923-kubelet-dir\") pod \"aws-ebs-csi-driver-node-24jx4\" (UID: \"9d731404-e3dc-45b5-a6c6-4d250a44a923\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-24jx4" Apr 20 07:50:05.337868 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.337487 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/c1bad8b1-05dc-4cf2-a2b4-6354840ca7a0-etc-sysctl-d\") pod \"tuned-gt9w2\" (UID: \"c1bad8b1-05dc-4cf2-a2b4-6354840ca7a0\") " pod="openshift-cluster-node-tuning-operator/tuned-gt9w2" Apr 20 07:50:05.337868 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.337477 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/c1bad8b1-05dc-4cf2-a2b4-6354840ca7a0-etc-systemd\") pod \"tuned-gt9w2\" (UID: \"c1bad8b1-05dc-4cf2-a2b4-6354840ca7a0\") " pod="openshift-cluster-node-tuning-operator/tuned-gt9w2" Apr 20 07:50:05.337868 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.337538 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/c1bad8b1-05dc-4cf2-a2b4-6354840ca7a0-etc-systemd\") pod \"tuned-gt9w2\" (UID: 
\"c1bad8b1-05dc-4cf2-a2b4-6354840ca7a0\") " pod="openshift-cluster-node-tuning-operator/tuned-gt9w2" Apr 20 07:50:05.337868 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.337543 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-24kmc\" (UniqueName: \"kubernetes.io/projected/c1bad8b1-05dc-4cf2-a2b4-6354840ca7a0-kube-api-access-24kmc\") pod \"tuned-gt9w2\" (UID: \"c1bad8b1-05dc-4cf2-a2b4-6354840ca7a0\") " pod="openshift-cluster-node-tuning-operator/tuned-gt9w2" Apr 20 07:50:05.337868 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.337575 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f3a24a42-45b7-4726-bc7c-32d9f9d61eaf-log-socket\") pod \"ovnkube-node-crn8d\" (UID: \"f3a24a42-45b7-4726-bc7c-32d9f9d61eaf\") " pod="openshift-ovn-kubernetes/ovnkube-node-crn8d" Apr 20 07:50:05.337868 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.337603 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f3a24a42-45b7-4726-bc7c-32d9f9d61eaf-ovnkube-script-lib\") pod \"ovnkube-node-crn8d\" (UID: \"f3a24a42-45b7-4726-bc7c-32d9f9d61eaf\") " pod="openshift-ovn-kubernetes/ovnkube-node-crn8d" Apr 20 07:50:05.337868 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.337644 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/04eada51-b824-41d7-8f99-553412d17053-host-run-netns\") pod \"multus-7r2r4\" (UID: \"04eada51-b824-41d7-8f99-553412d17053\") " pod="openshift-multus/multus-7r2r4" Apr 20 07:50:05.337868 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.337669 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: 
\"kubernetes.io/host-path/c1bad8b1-05dc-4cf2-a2b4-6354840ca7a0-run\") pod \"tuned-gt9w2\" (UID: \"c1bad8b1-05dc-4cf2-a2b4-6354840ca7a0\") " pod="openshift-cluster-node-tuning-operator/tuned-gt9w2" Apr 20 07:50:05.337868 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.337695 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ba7d0588-c3eb-4849-ae3b-630be7fcc621-os-release\") pod \"multus-additional-cni-plugins-hzzk5\" (UID: \"ba7d0588-c3eb-4849-ae3b-630be7fcc621\") " pod="openshift-multus/multus-additional-cni-plugins-hzzk5" Apr 20 07:50:05.337868 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.337723 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/04eada51-b824-41d7-8f99-553412d17053-cnibin\") pod \"multus-7r2r4\" (UID: \"04eada51-b824-41d7-8f99-553412d17053\") " pod="openshift-multus/multus-7r2r4" Apr 20 07:50:05.337868 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.337755 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/9d731404-e3dc-45b5-a6c6-4d250a44a923-device-dir\") pod \"aws-ebs-csi-driver-node-24jx4\" (UID: \"9d731404-e3dc-45b5-a6c6-4d250a44a923\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-24jx4" Apr 20 07:50:05.337868 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.337800 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85f66\" (UniqueName: \"kubernetes.io/projected/9d731404-e3dc-45b5-a6c6-4d250a44a923-kube-api-access-85f66\") pod \"aws-ebs-csi-driver-node-24jx4\" (UID: \"9d731404-e3dc-45b5-a6c6-4d250a44a923\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-24jx4" Apr 20 07:50:05.337868 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.337833 2575 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c1bad8b1-05dc-4cf2-a2b4-6354840ca7a0-run\") pod \"tuned-gt9w2\" (UID: \"c1bad8b1-05dc-4cf2-a2b4-6354840ca7a0\") " pod="openshift-cluster-node-tuning-operator/tuned-gt9w2" Apr 20 07:50:05.337868 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.337836 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/c1bad8b1-05dc-4cf2-a2b4-6354840ca7a0-etc-sysctl-conf\") pod \"tuned-gt9w2\" (UID: \"c1bad8b1-05dc-4cf2-a2b4-6354840ca7a0\") " pod="openshift-cluster-node-tuning-operator/tuned-gt9w2" Apr 20 07:50:05.338601 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.337963 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/c1bad8b1-05dc-4cf2-a2b4-6354840ca7a0-etc-sysctl-conf\") pod \"tuned-gt9w2\" (UID: \"c1bad8b1-05dc-4cf2-a2b4-6354840ca7a0\") " pod="openshift-cluster-node-tuning-operator/tuned-gt9w2" Apr 20 07:50:05.338601 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.338003 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ba7d0588-c3eb-4849-ae3b-630be7fcc621-cni-binary-copy\") pod \"multus-additional-cni-plugins-hzzk5\" (UID: \"ba7d0588-c3eb-4849-ae3b-630be7fcc621\") " pod="openshift-multus/multus-additional-cni-plugins-hzzk5" Apr 20 07:50:05.338601 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.338056 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ba7d0588-c3eb-4849-ae3b-630be7fcc621-tuning-conf-dir\") pod \"multus-additional-cni-plugins-hzzk5\" (UID: \"ba7d0588-c3eb-4849-ae3b-630be7fcc621\") " pod="openshift-multus/multus-additional-cni-plugins-hzzk5" Apr 20 07:50:05.338601 ip-10-0-129-24 
kubenswrapper[2575]: I0420 07:50:05.338060 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/ba7d0588-c3eb-4849-ae3b-630be7fcc621-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-hzzk5\" (UID: \"ba7d0588-c3eb-4849-ae3b-630be7fcc621\") " pod="openshift-multus/multus-additional-cni-plugins-hzzk5" Apr 20 07:50:05.338601 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.338122 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ba7d0588-c3eb-4849-ae3b-630be7fcc621-os-release\") pod \"multus-additional-cni-plugins-hzzk5\" (UID: \"ba7d0588-c3eb-4849-ae3b-630be7fcc621\") " pod="openshift-multus/multus-additional-cni-plugins-hzzk5" Apr 20 07:50:05.338601 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.338116 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/f8898642-a099-48c6-ba9f-0a5099d78d5c-iptables-alerter-script\") pod \"iptables-alerter-459v4\" (UID: \"f8898642-a099-48c6-ba9f-0a5099d78d5c\") " pod="openshift-network-operator/iptables-alerter-459v4" Apr 20 07:50:05.338601 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.338217 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ba7d0588-c3eb-4849-ae3b-630be7fcc621-tuning-conf-dir\") pod \"multus-additional-cni-plugins-hzzk5\" (UID: \"ba7d0588-c3eb-4849-ae3b-630be7fcc621\") " pod="openshift-multus/multus-additional-cni-plugins-hzzk5" Apr 20 07:50:05.338601 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.338224 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f3a24a42-45b7-4726-bc7c-32d9f9d61eaf-host-run-netns\") pod \"ovnkube-node-crn8d\" 
(UID: \"f3a24a42-45b7-4726-bc7c-32d9f9d61eaf\") " pod="openshift-ovn-kubernetes/ovnkube-node-crn8d" Apr 20 07:50:05.338601 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.338276 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/04eada51-b824-41d7-8f99-553412d17053-multus-daemon-config\") pod \"multus-7r2r4\" (UID: \"04eada51-b824-41d7-8f99-553412d17053\") " pod="openshift-multus/multus-7r2r4" Apr 20 07:50:05.338601 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.338307 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nckkc\" (UniqueName: \"kubernetes.io/projected/46b8deca-b33b-45e1-9131-b47fde192a78-kube-api-access-nckkc\") pod \"network-metrics-daemon-7qpdh\" (UID: \"46b8deca-b33b-45e1-9131-b47fde192a78\") " pod="openshift-multus/network-metrics-daemon-7qpdh" Apr 20 07:50:05.338601 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.338349 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/c1bad8b1-05dc-4cf2-a2b4-6354840ca7a0-etc-sysconfig\") pod \"tuned-gt9w2\" (UID: \"c1bad8b1-05dc-4cf2-a2b4-6354840ca7a0\") " pod="openshift-cluster-node-tuning-operator/tuned-gt9w2" Apr 20 07:50:05.338601 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.338438 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/c1bad8b1-05dc-4cf2-a2b4-6354840ca7a0-etc-tuned\") pod \"tuned-gt9w2\" (UID: \"c1bad8b1-05dc-4cf2-a2b4-6354840ca7a0\") " pod="openshift-cluster-node-tuning-operator/tuned-gt9w2" Apr 20 07:50:05.338601 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.338465 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c1bad8b1-05dc-4cf2-a2b4-6354840ca7a0-tmp\") pod 
\"tuned-gt9w2\" (UID: \"c1bad8b1-05dc-4cf2-a2b4-6354840ca7a0\") " pod="openshift-cluster-node-tuning-operator/tuned-gt9w2" Apr 20 07:50:05.338601 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.338559 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/14afe6ef-f671-4736-ba3f-ac6236c30291-hosts-file\") pod \"node-resolver-z54dw\" (UID: \"14afe6ef-f671-4736-ba3f-ac6236c30291\") " pod="openshift-dns/node-resolver-z54dw" Apr 20 07:50:05.338601 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.338581 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/c1bad8b1-05dc-4cf2-a2b4-6354840ca7a0-etc-sysconfig\") pod \"tuned-gt9w2\" (UID: \"c1bad8b1-05dc-4cf2-a2b4-6354840ca7a0\") " pod="openshift-cluster-node-tuning-operator/tuned-gt9w2" Apr 20 07:50:05.338601 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.338589 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/04eada51-b824-41d7-8f99-553412d17053-cni-binary-copy\") pod \"multus-7r2r4\" (UID: \"04eada51-b824-41d7-8f99-553412d17053\") " pod="openshift-multus/multus-7r2r4" Apr 20 07:50:05.339342 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.338606 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ba7d0588-c3eb-4849-ae3b-630be7fcc621-cni-binary-copy\") pod \"multus-additional-cni-plugins-hzzk5\" (UID: \"ba7d0588-c3eb-4849-ae3b-630be7fcc621\") " pod="openshift-multus/multus-additional-cni-plugins-hzzk5" Apr 20 07:50:05.339342 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.338699 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: 
\"kubernetes.io/host-path/04eada51-b824-41d7-8f99-553412d17053-multus-socket-dir-parent\") pod \"multus-7r2r4\" (UID: \"04eada51-b824-41d7-8f99-553412d17053\") " pod="openshift-multus/multus-7r2r4" Apr 20 07:50:05.339342 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.338702 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/f8898642-a099-48c6-ba9f-0a5099d78d5c-iptables-alerter-script\") pod \"iptables-alerter-459v4\" (UID: \"f8898642-a099-48c6-ba9f-0a5099d78d5c\") " pod="openshift-network-operator/iptables-alerter-459v4" Apr 20 07:50:05.339342 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.338711 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/14afe6ef-f671-4736-ba3f-ac6236c30291-hosts-file\") pod \"node-resolver-z54dw\" (UID: \"14afe6ef-f671-4736-ba3f-ac6236c30291\") " pod="openshift-dns/node-resolver-z54dw" Apr 20 07:50:05.339342 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.338730 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qprwh\" (UniqueName: \"kubernetes.io/projected/04eada51-b824-41d7-8f99-553412d17053-kube-api-access-qprwh\") pod \"multus-7r2r4\" (UID: \"04eada51-b824-41d7-8f99-553412d17053\") " pod="openshift-multus/multus-7r2r4" Apr 20 07:50:05.339342 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.338783 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c1bad8b1-05dc-4cf2-a2b4-6354840ca7a0-etc-kubernetes\") pod \"tuned-gt9w2\" (UID: \"c1bad8b1-05dc-4cf2-a2b4-6354840ca7a0\") " pod="openshift-cluster-node-tuning-operator/tuned-gt9w2" Apr 20 07:50:05.339342 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.338810 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" 
(UniqueName: \"kubernetes.io/empty-dir/14afe6ef-f671-4736-ba3f-ac6236c30291-tmp-dir\") pod \"node-resolver-z54dw\" (UID: \"14afe6ef-f671-4736-ba3f-ac6236c30291\") " pod="openshift-dns/node-resolver-z54dw" Apr 20 07:50:05.339342 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.338827 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c1bad8b1-05dc-4cf2-a2b4-6354840ca7a0-etc-kubernetes\") pod \"tuned-gt9w2\" (UID: \"c1bad8b1-05dc-4cf2-a2b4-6354840ca7a0\") " pod="openshift-cluster-node-tuning-operator/tuned-gt9w2" Apr 20 07:50:05.339342 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.338846 2575 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 20 07:50:05.339342 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.338862 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f3a24a42-45b7-4726-bc7c-32d9f9d61eaf-systemd-units\") pod \"ovnkube-node-crn8d\" (UID: \"f3a24a42-45b7-4726-bc7c-32d9f9d61eaf\") " pod="openshift-ovn-kubernetes/ovnkube-node-crn8d" Apr 20 07:50:05.339342 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.338888 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f3a24a42-45b7-4726-bc7c-32d9f9d61eaf-run-openvswitch\") pod \"ovnkube-node-crn8d\" (UID: \"f3a24a42-45b7-4726-bc7c-32d9f9d61eaf\") " pod="openshift-ovn-kubernetes/ovnkube-node-crn8d" Apr 20 07:50:05.339342 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.338916 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/04eada51-b824-41d7-8f99-553412d17053-host-var-lib-cni-bin\") pod \"multus-7r2r4\" (UID: \"04eada51-b824-41d7-8f99-553412d17053\") " pod="openshift-multus/multus-7r2r4" Apr 20 07:50:05.339342 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.338943 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/9d731404-e3dc-45b5-a6c6-4d250a44a923-etc-selinux\") pod \"aws-ebs-csi-driver-node-24jx4\" (UID: \"9d731404-e3dc-45b5-a6c6-4d250a44a923\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-24jx4" Apr 20 07:50:05.339342 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.338976 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/46b8deca-b33b-45e1-9131-b47fde192a78-metrics-certs\") pod \"network-metrics-daemon-7qpdh\" (UID: \"46b8deca-b33b-45e1-9131-b47fde192a78\") " pod="openshift-multus/network-metrics-daemon-7qpdh" Apr 20 07:50:05.339342 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.339055 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c1bad8b1-05dc-4cf2-a2b4-6354840ca7a0-host\") pod \"tuned-gt9w2\" (UID: \"c1bad8b1-05dc-4cf2-a2b4-6354840ca7a0\") " pod="openshift-cluster-node-tuning-operator/tuned-gt9w2" Apr 20 07:50:05.339342 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.339085 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f3a24a42-45b7-4726-bc7c-32d9f9d61eaf-ovnkube-config\") pod \"ovnkube-node-crn8d\" (UID: \"f3a24a42-45b7-4726-bc7c-32d9f9d61eaf\") " pod="openshift-ovn-kubernetes/ovnkube-node-crn8d" Apr 20 07:50:05.339342 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.339109 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/04eada51-b824-41d7-8f99-553412d17053-multus-cni-dir\") pod \"multus-7r2r4\" (UID: \"04eada51-b824-41d7-8f99-553412d17053\") " pod="openshift-multus/multus-7r2r4" Apr 20 07:50:05.340157 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.339134 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/04eada51-b824-41d7-8f99-553412d17053-host-var-lib-cni-multus\") pod \"multus-7r2r4\" (UID: \"04eada51-b824-41d7-8f99-553412d17053\") " pod="openshift-multus/multus-7r2r4" Apr 20 07:50:05.340157 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.339162 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/04eada51-b824-41d7-8f99-553412d17053-multus-conf-dir\") pod \"multus-7r2r4\" (UID: \"04eada51-b824-41d7-8f99-553412d17053\") " pod="openshift-multus/multus-7r2r4" Apr 20 07:50:05.340157 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:05.339189 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 07:50:05.340157 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.339199 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/04eada51-b824-41d7-8f99-553412d17053-host-run-multus-certs\") pod \"multus-7r2r4\" (UID: \"04eada51-b824-41d7-8f99-553412d17053\") " pod="openshift-multus/multus-7r2r4" Apr 20 07:50:05.340157 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.339236 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c1bad8b1-05dc-4cf2-a2b4-6354840ca7a0-host\") 
pod \"tuned-gt9w2\" (UID: \"c1bad8b1-05dc-4cf2-a2b4-6354840ca7a0\") " pod="openshift-cluster-node-tuning-operator/tuned-gt9w2" Apr 20 07:50:05.340157 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:05.339252 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/46b8deca-b33b-45e1-9131-b47fde192a78-metrics-certs podName:46b8deca-b33b-45e1-9131-b47fde192a78 nodeName:}" failed. No retries permitted until 2026-04-20 07:50:05.839233202 +0000 UTC m=+3.012560631 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/46b8deca-b33b-45e1-9131-b47fde192a78-metrics-certs") pod "network-metrics-daemon-7qpdh" (UID: "46b8deca-b33b-45e1-9131-b47fde192a78") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 07:50:05.340157 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.339276 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8nc7c\" (UniqueName: \"kubernetes.io/projected/ef4d76cc-35ef-46b3-83b9-ee5fe3bb7a06-kube-api-access-8nc7c\") pod \"network-check-target-sfl6t\" (UID: \"ef4d76cc-35ef-46b3-83b9-ee5fe3bb7a06\") " pod="openshift-network-diagnostics/network-check-target-sfl6t" Apr 20 07:50:05.340157 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.339307 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c1bad8b1-05dc-4cf2-a2b4-6354840ca7a0-lib-modules\") pod \"tuned-gt9w2\" (UID: \"c1bad8b1-05dc-4cf2-a2b4-6354840ca7a0\") " pod="openshift-cluster-node-tuning-operator/tuned-gt9w2" Apr 20 07:50:05.340157 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.339189 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/14afe6ef-f671-4736-ba3f-ac6236c30291-tmp-dir\") pod \"node-resolver-z54dw\" (UID: \"14afe6ef-f671-4736-ba3f-ac6236c30291\") " 
pod="openshift-dns/node-resolver-z54dw"
Apr 20 07:50:05.340157 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.339334 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hbfr6\" (UniqueName: \"kubernetes.io/projected/14afe6ef-f671-4736-ba3f-ac6236c30291-kube-api-access-hbfr6\") pod \"node-resolver-z54dw\" (UID: \"14afe6ef-f671-4736-ba3f-ac6236c30291\") " pod="openshift-dns/node-resolver-z54dw"
Apr 20 07:50:05.340157 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.339468 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c1bad8b1-05dc-4cf2-a2b4-6354840ca7a0-lib-modules\") pod \"tuned-gt9w2\" (UID: \"c1bad8b1-05dc-4cf2-a2b4-6354840ca7a0\") " pod="openshift-cluster-node-tuning-operator/tuned-gt9w2"
Apr 20 07:50:05.340157 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.339607 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b7ed9bc9-8f0a-40e4-bd11-25b4f1c3cd39-host\") pod \"node-ca-n9944\" (UID: \"b7ed9bc9-8f0a-40e4-bd11-25b4f1c3cd39\") " pod="openshift-image-registry/node-ca-n9944"
Apr 20 07:50:05.340157 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.339360 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b7ed9bc9-8f0a-40e4-bd11-25b4f1c3cd39-host\") pod \"node-ca-n9944\" (UID: \"b7ed9bc9-8f0a-40e4-bd11-25b4f1c3cd39\") " pod="openshift-image-registry/node-ca-n9944"
Apr 20 07:50:05.340157 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.339873 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f3a24a42-45b7-4726-bc7c-32d9f9d61eaf-run-systemd\") pod \"ovnkube-node-crn8d\" (UID: \"f3a24a42-45b7-4726-bc7c-32d9f9d61eaf\") " pod="openshift-ovn-kubernetes/ovnkube-node-crn8d"
Apr 20 07:50:05.340157 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.339898 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f3a24a42-45b7-4726-bc7c-32d9f9d61eaf-run-ovn\") pod \"ovnkube-node-crn8d\" (UID: \"f3a24a42-45b7-4726-bc7c-32d9f9d61eaf\") " pod="openshift-ovn-kubernetes/ovnkube-node-crn8d"
Apr 20 07:50:05.340157 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.339931 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c1bad8b1-05dc-4cf2-a2b4-6354840ca7a0-var-lib-kubelet\") pod \"tuned-gt9w2\" (UID: \"c1bad8b1-05dc-4cf2-a2b4-6354840ca7a0\") " pod="openshift-cluster-node-tuning-operator/tuned-gt9w2"
Apr 20 07:50:05.340157 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.339957 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f8898642-a099-48c6-ba9f-0a5099d78d5c-host-slash\") pod \"iptables-alerter-459v4\" (UID: \"f8898642-a099-48c6-ba9f-0a5099d78d5c\") " pod="openshift-network-operator/iptables-alerter-459v4"
Apr 20 07:50:05.340973 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.339981 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/ba7d0588-c3eb-4849-ae3b-630be7fcc621-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-hzzk5\" (UID: \"ba7d0588-c3eb-4849-ae3b-630be7fcc621\") " pod="openshift-multus/multus-additional-cni-plugins-hzzk5"
Apr 20 07:50:05.340973 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.340008 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f3a24a42-45b7-4726-bc7c-32d9f9d61eaf-host-kubelet\") pod \"ovnkube-node-crn8d\" (UID: \"f3a24a42-45b7-4726-bc7c-32d9f9d61eaf\") " pod="openshift-ovn-kubernetes/ovnkube-node-crn8d"
Apr 20 07:50:05.340973 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.340031 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f3a24a42-45b7-4726-bc7c-32d9f9d61eaf-node-log\") pod \"ovnkube-node-crn8d\" (UID: \"f3a24a42-45b7-4726-bc7c-32d9f9d61eaf\") " pod="openshift-ovn-kubernetes/ovnkube-node-crn8d"
Apr 20 07:50:05.340973 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.340056 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/b7ed9bc9-8f0a-40e4-bd11-25b4f1c3cd39-serviceca\") pod \"node-ca-n9944\" (UID: \"b7ed9bc9-8f0a-40e4-bd11-25b4f1c3cd39\") " pod="openshift-image-registry/node-ca-n9944"
Apr 20 07:50:05.340973 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.340080 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-76nct\" (UniqueName: \"kubernetes.io/projected/b7ed9bc9-8f0a-40e4-bd11-25b4f1c3cd39-kube-api-access-76nct\") pod \"node-ca-n9944\" (UID: \"b7ed9bc9-8f0a-40e4-bd11-25b4f1c3cd39\") " pod="openshift-image-registry/node-ca-n9944"
Apr 20 07:50:05.340973 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.340105 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-brwqt\" (UniqueName: \"kubernetes.io/projected/f8898642-a099-48c6-ba9f-0a5099d78d5c-kube-api-access-brwqt\") pod \"iptables-alerter-459v4\" (UID: \"f8898642-a099-48c6-ba9f-0a5099d78d5c\") " pod="openshift-network-operator/iptables-alerter-459v4"
Apr 20 07:50:05.340973 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.340133 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/04eada51-b824-41d7-8f99-553412d17053-host-var-lib-kubelet\") pod \"multus-7r2r4\" (UID: \"04eada51-b824-41d7-8f99-553412d17053\") " pod="openshift-multus/multus-7r2r4"
Apr 20 07:50:05.340973 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.340159 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/c1bad8b1-05dc-4cf2-a2b4-6354840ca7a0-etc-modprobe-d\") pod \"tuned-gt9w2\" (UID: \"c1bad8b1-05dc-4cf2-a2b4-6354840ca7a0\") " pod="openshift-cluster-node-tuning-operator/tuned-gt9w2"
Apr 20 07:50:05.340973 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.340183 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ba7d0588-c3eb-4849-ae3b-630be7fcc621-cnibin\") pod \"multus-additional-cni-plugins-hzzk5\" (UID: \"ba7d0588-c3eb-4849-ae3b-630be7fcc621\") " pod="openshift-multus/multus-additional-cni-plugins-hzzk5"
Apr 20 07:50:05.340973 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.340211 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f3a24a42-45b7-4726-bc7c-32d9f9d61eaf-host-slash\") pod \"ovnkube-node-crn8d\" (UID: \"f3a24a42-45b7-4726-bc7c-32d9f9d61eaf\") " pod="openshift-ovn-kubernetes/ovnkube-node-crn8d"
Apr 20 07:50:05.340973 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.340234 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f3a24a42-45b7-4726-bc7c-32d9f9d61eaf-host-run-ovn-kubernetes\") pod \"ovnkube-node-crn8d\" (UID: \"f3a24a42-45b7-4726-bc7c-32d9f9d61eaf\") " pod="openshift-ovn-kubernetes/ovnkube-node-crn8d"
Apr 20 07:50:05.340973 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.340265 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f3a24a42-45b7-4726-bc7c-32d9f9d61eaf-ovn-node-metrics-cert\") pod \"ovnkube-node-crn8d\" (UID: \"f3a24a42-45b7-4726-bc7c-32d9f9d61eaf\") " pod="openshift-ovn-kubernetes/ovnkube-node-crn8d"
Apr 20 07:50:05.340973 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.340291 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/04eada51-b824-41d7-8f99-553412d17053-hostroot\") pod \"multus-7r2r4\" (UID: \"04eada51-b824-41d7-8f99-553412d17053\") " pod="openshift-multus/multus-7r2r4"
Apr 20 07:50:05.340973 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.340316 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/1755049c-9fb4-42bf-8134-d41e0e7a4e97-agent-certs\") pod \"konnectivity-agent-ggtd4\" (UID: \"1755049c-9fb4-42bf-8134-d41e0e7a4e97\") " pod="kube-system/konnectivity-agent-ggtd4"
Apr 20 07:50:05.340973 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.340341 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j57md\" (UniqueName: \"kubernetes.io/projected/f3a24a42-45b7-4726-bc7c-32d9f9d61eaf-kube-api-access-j57md\") pod \"ovnkube-node-crn8d\" (UID: \"f3a24a42-45b7-4726-bc7c-32d9f9d61eaf\") " pod="openshift-ovn-kubernetes/ovnkube-node-crn8d"
Apr 20 07:50:05.340973 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.340367 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f3a24a42-45b7-4726-bc7c-32d9f9d61eaf-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-crn8d\" (UID: \"f3a24a42-45b7-4726-bc7c-32d9f9d61eaf\") " pod="openshift-ovn-kubernetes/ovnkube-node-crn8d"
Apr 20 07:50:05.340973 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.340391 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/04eada51-b824-41d7-8f99-553412d17053-os-release\") pod \"multus-7r2r4\" (UID: \"04eada51-b824-41d7-8f99-553412d17053\") " pod="openshift-multus/multus-7r2r4"
Apr 20 07:50:05.341757 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.340531 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c1bad8b1-05dc-4cf2-a2b4-6354840ca7a0-var-lib-kubelet\") pod \"tuned-gt9w2\" (UID: \"c1bad8b1-05dc-4cf2-a2b4-6354840ca7a0\") " pod="openshift-cluster-node-tuning-operator/tuned-gt9w2"
Apr 20 07:50:05.341757 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.340588 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f8898642-a099-48c6-ba9f-0a5099d78d5c-host-slash\") pod \"iptables-alerter-459v4\" (UID: \"f8898642-a099-48c6-ba9f-0a5099d78d5c\") " pod="openshift-network-operator/iptables-alerter-459v4"
Apr 20 07:50:05.341757 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.341056 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/ba7d0588-c3eb-4849-ae3b-630be7fcc621-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-hzzk5\" (UID: \"ba7d0588-c3eb-4849-ae3b-630be7fcc621\") " pod="openshift-multus/multus-additional-cni-plugins-hzzk5"
Apr 20 07:50:05.341757 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.341550 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/b7ed9bc9-8f0a-40e4-bd11-25b4f1c3cd39-serviceca\") pod \"node-ca-n9944\" (UID: \"b7ed9bc9-8f0a-40e4-bd11-25b4f1c3cd39\") " pod="openshift-image-registry/node-ca-n9944"
Apr 20 07:50:05.342092 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.342068 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/c1bad8b1-05dc-4cf2-a2b4-6354840ca7a0-etc-modprobe-d\") pod \"tuned-gt9w2\" (UID: \"c1bad8b1-05dc-4cf2-a2b4-6354840ca7a0\") " pod="openshift-cluster-node-tuning-operator/tuned-gt9w2"
Apr 20 07:50:05.342092 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.342071 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ba7d0588-c3eb-4849-ae3b-630be7fcc621-cnibin\") pod \"multus-additional-cni-plugins-hzzk5\" (UID: \"ba7d0588-c3eb-4849-ae3b-630be7fcc621\") " pod="openshift-multus/multus-additional-cni-plugins-hzzk5"
Apr 20 07:50:05.342229 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.342138 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c1bad8b1-05dc-4cf2-a2b4-6354840ca7a0-tmp\") pod \"tuned-gt9w2\" (UID: \"c1bad8b1-05dc-4cf2-a2b4-6354840ca7a0\") " pod="openshift-cluster-node-tuning-operator/tuned-gt9w2"
Apr 20 07:50:05.342229 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.342175 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/c1bad8b1-05dc-4cf2-a2b4-6354840ca7a0-etc-tuned\") pod \"tuned-gt9w2\" (UID: \"c1bad8b1-05dc-4cf2-a2b4-6354840ca7a0\") " pod="openshift-cluster-node-tuning-operator/tuned-gt9w2"
Apr 20 07:50:05.351420 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.351299 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-76nct\" (UniqueName: \"kubernetes.io/projected/b7ed9bc9-8f0a-40e4-bd11-25b4f1c3cd39-kube-api-access-76nct\") pod \"node-ca-n9944\" (UID: \"b7ed9bc9-8f0a-40e4-bd11-25b4f1c3cd39\") " pod="openshift-image-registry/node-ca-n9944"
Apr 20 07:50:05.353135 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.353106 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hbfr6\" (UniqueName: \"kubernetes.io/projected/14afe6ef-f671-4736-ba3f-ac6236c30291-kube-api-access-hbfr6\") pod \"node-resolver-z54dw\" (UID: \"14afe6ef-f671-4736-ba3f-ac6236c30291\") " pod="openshift-dns/node-resolver-z54dw"
Apr 20 07:50:05.353243 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.353107 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-brwqt\" (UniqueName: \"kubernetes.io/projected/f8898642-a099-48c6-ba9f-0a5099d78d5c-kube-api-access-brwqt\") pod \"iptables-alerter-459v4\" (UID: \"f8898642-a099-48c6-ba9f-0a5099d78d5c\") " pod="openshift-network-operator/iptables-alerter-459v4"
Apr 20 07:50:05.353243 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.353112 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5gt6b\" (UniqueName: \"kubernetes.io/projected/ba7d0588-c3eb-4849-ae3b-630be7fcc621-kube-api-access-5gt6b\") pod \"multus-additional-cni-plugins-hzzk5\" (UID: \"ba7d0588-c3eb-4849-ae3b-630be7fcc621\") " pod="openshift-multus/multus-additional-cni-plugins-hzzk5"
Apr 20 07:50:05.354593 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.354562 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nckkc\" (UniqueName: \"kubernetes.io/projected/46b8deca-b33b-45e1-9131-b47fde192a78-kube-api-access-nckkc\") pod \"network-metrics-daemon-7qpdh\" (UID: \"46b8deca-b33b-45e1-9131-b47fde192a78\") " pod="openshift-multus/network-metrics-daemon-7qpdh"
Apr 20 07:50:05.355105 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.355072 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-24kmc\" (UniqueName: \"kubernetes.io/projected/c1bad8b1-05dc-4cf2-a2b4-6354840ca7a0-kube-api-access-24kmc\") pod \"tuned-gt9w2\" (UID: \"c1bad8b1-05dc-4cf2-a2b4-6354840ca7a0\") " pod="openshift-cluster-node-tuning-operator/tuned-gt9w2"
Apr 20 07:50:05.435252 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.435218 2575 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 07:50:05.441416 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.441385 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f3a24a42-45b7-4726-bc7c-32d9f9d61eaf-host-cni-netd\") pod \"ovnkube-node-crn8d\" (UID: \"f3a24a42-45b7-4726-bc7c-32d9f9d61eaf\") " pod="openshift-ovn-kubernetes/ovnkube-node-crn8d"
Apr 20 07:50:05.441568 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.441425 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f3a24a42-45b7-4726-bc7c-32d9f9d61eaf-env-overrides\") pod \"ovnkube-node-crn8d\" (UID: \"f3a24a42-45b7-4726-bc7c-32d9f9d61eaf\") " pod="openshift-ovn-kubernetes/ovnkube-node-crn8d"
Apr 20 07:50:05.441568 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.441447 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/04eada51-b824-41d7-8f99-553412d17053-system-cni-dir\") pod \"multus-7r2r4\" (UID: \"04eada51-b824-41d7-8f99-553412d17053\") " pod="openshift-multus/multus-7r2r4"
Apr 20 07:50:05.441568 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.441471 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/04eada51-b824-41d7-8f99-553412d17053-etc-kubernetes\") pod \"multus-7r2r4\" (UID: \"04eada51-b824-41d7-8f99-553412d17053\") " pod="openshift-multus/multus-7r2r4"
Apr 20 07:50:05.441568 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.441495 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f3a24a42-45b7-4726-bc7c-32d9f9d61eaf-host-cni-netd\") pod \"ovnkube-node-crn8d\" (UID: \"f3a24a42-45b7-4726-bc7c-32d9f9d61eaf\") " pod="openshift-ovn-kubernetes/ovnkube-node-crn8d"
Apr 20 07:50:05.441568 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.441511 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/9d731404-e3dc-45b5-a6c6-4d250a44a923-registration-dir\") pod \"aws-ebs-csi-driver-node-24jx4\" (UID: \"9d731404-e3dc-45b5-a6c6-4d250a44a923\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-24jx4"
Apr 20 07:50:05.441568 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.441538 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/04eada51-b824-41d7-8f99-553412d17053-system-cni-dir\") pod \"multus-7r2r4\" (UID: \"04eada51-b824-41d7-8f99-553412d17053\") " pod="openshift-multus/multus-7r2r4"
Apr 20 07:50:05.441568 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.441546 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/9d731404-e3dc-45b5-a6c6-4d250a44a923-sys-fs\") pod \"aws-ebs-csi-driver-node-24jx4\" (UID: \"9d731404-e3dc-45b5-a6c6-4d250a44a923\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-24jx4"
Apr 20 07:50:05.441952 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.441579 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f3a24a42-45b7-4726-bc7c-32d9f9d61eaf-var-lib-openvswitch\") pod \"ovnkube-node-crn8d\" (UID: \"f3a24a42-45b7-4726-bc7c-32d9f9d61eaf\") " pod="openshift-ovn-kubernetes/ovnkube-node-crn8d"
Apr 20 07:50:05.441952 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.441579 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/04eada51-b824-41d7-8f99-553412d17053-etc-kubernetes\") pod \"multus-7r2r4\" (UID: \"04eada51-b824-41d7-8f99-553412d17053\") " pod="openshift-multus/multus-7r2r4"
Apr 20 07:50:05.441952 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.441593 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/9d731404-e3dc-45b5-a6c6-4d250a44a923-sys-fs\") pod \"aws-ebs-csi-driver-node-24jx4\" (UID: \"9d731404-e3dc-45b5-a6c6-4d250a44a923\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-24jx4"
Apr 20 07:50:05.441952 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.441605 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f3a24a42-45b7-4726-bc7c-32d9f9d61eaf-host-cni-bin\") pod \"ovnkube-node-crn8d\" (UID: \"f3a24a42-45b7-4726-bc7c-32d9f9d61eaf\") " pod="openshift-ovn-kubernetes/ovnkube-node-crn8d"
Apr 20 07:50:05.441952 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.441663 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/1755049c-9fb4-42bf-8134-d41e0e7a4e97-konnectivity-ca\") pod \"konnectivity-agent-ggtd4\" (UID: \"1755049c-9fb4-42bf-8134-d41e0e7a4e97\") " pod="kube-system/konnectivity-agent-ggtd4"
Apr 20 07:50:05.441952 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.441677 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f3a24a42-45b7-4726-bc7c-32d9f9d61eaf-var-lib-openvswitch\") pod \"ovnkube-node-crn8d\" (UID: \"f3a24a42-45b7-4726-bc7c-32d9f9d61eaf\") " pod="openshift-ovn-kubernetes/ovnkube-node-crn8d"
Apr 20 07:50:05.441952 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.441703 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/9d731404-e3dc-45b5-a6c6-4d250a44a923-socket-dir\") pod \"aws-ebs-csi-driver-node-24jx4\" (UID: \"9d731404-e3dc-45b5-a6c6-4d250a44a923\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-24jx4"
Apr 20 07:50:05.441952 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.441729 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f3a24a42-45b7-4726-bc7c-32d9f9d61eaf-host-cni-bin\") pod \"ovnkube-node-crn8d\" (UID: \"f3a24a42-45b7-4726-bc7c-32d9f9d61eaf\") " pod="openshift-ovn-kubernetes/ovnkube-node-crn8d"
Apr 20 07:50:05.441952 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.441788 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f3a24a42-45b7-4726-bc7c-32d9f9d61eaf-etc-openvswitch\") pod \"ovnkube-node-crn8d\" (UID: \"f3a24a42-45b7-4726-bc7c-32d9f9d61eaf\") " pod="openshift-ovn-kubernetes/ovnkube-node-crn8d"
Apr 20 07:50:05.441952 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.441785 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/9d731404-e3dc-45b5-a6c6-4d250a44a923-registration-dir\") pod \"aws-ebs-csi-driver-node-24jx4\" (UID: \"9d731404-e3dc-45b5-a6c6-4d250a44a923\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-24jx4"
Apr 20 07:50:05.441952 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.441824 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f3a24a42-45b7-4726-bc7c-32d9f9d61eaf-etc-openvswitch\") pod \"ovnkube-node-crn8d\" (UID: \"f3a24a42-45b7-4726-bc7c-32d9f9d61eaf\") " pod="openshift-ovn-kubernetes/ovnkube-node-crn8d"
Apr 20 07:50:05.441952 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.441862 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/04eada51-b824-41d7-8f99-553412d17053-host-run-k8s-cni-cncf-io\") pod \"multus-7r2r4\" (UID: \"04eada51-b824-41d7-8f99-553412d17053\") " pod="openshift-multus/multus-7r2r4"
Apr 20 07:50:05.441952 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.441893 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/9d731404-e3dc-45b5-a6c6-4d250a44a923-socket-dir\") pod \"aws-ebs-csi-driver-node-24jx4\" (UID: \"9d731404-e3dc-45b5-a6c6-4d250a44a923\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-24jx4"
Apr 20 07:50:05.441952 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.441902 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9d731404-e3dc-45b5-a6c6-4d250a44a923-kubelet-dir\") pod \"aws-ebs-csi-driver-node-24jx4\" (UID: \"9d731404-e3dc-45b5-a6c6-4d250a44a923\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-24jx4"
Apr 20 07:50:05.441952 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.441943 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/04eada51-b824-41d7-8f99-553412d17053-host-run-k8s-cni-cncf-io\") pod \"multus-7r2r4\" (UID: \"04eada51-b824-41d7-8f99-553412d17053\") " pod="openshift-multus/multus-7r2r4"
Apr 20 07:50:05.441952 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.441941 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f3a24a42-45b7-4726-bc7c-32d9f9d61eaf-env-overrides\") pod \"ovnkube-node-crn8d\" (UID: \"f3a24a42-45b7-4726-bc7c-32d9f9d61eaf\") " pod="openshift-ovn-kubernetes/ovnkube-node-crn8d"
Apr 20 07:50:05.442702 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.442032 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9d731404-e3dc-45b5-a6c6-4d250a44a923-kubelet-dir\") pod \"aws-ebs-csi-driver-node-24jx4\" (UID: \"9d731404-e3dc-45b5-a6c6-4d250a44a923\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-24jx4"
Apr 20 07:50:05.442702 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.442080 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f3a24a42-45b7-4726-bc7c-32d9f9d61eaf-log-socket\") pod \"ovnkube-node-crn8d\" (UID: \"f3a24a42-45b7-4726-bc7c-32d9f9d61eaf\") " pod="openshift-ovn-kubernetes/ovnkube-node-crn8d"
Apr 20 07:50:05.442702 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.442116 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f3a24a42-45b7-4726-bc7c-32d9f9d61eaf-ovnkube-script-lib\") pod \"ovnkube-node-crn8d\" (UID: \"f3a24a42-45b7-4726-bc7c-32d9f9d61eaf\") " pod="openshift-ovn-kubernetes/ovnkube-node-crn8d"
Apr 20 07:50:05.442702 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.442142 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/04eada51-b824-41d7-8f99-553412d17053-host-run-netns\") pod \"multus-7r2r4\" (UID: \"04eada51-b824-41d7-8f99-553412d17053\") " pod="openshift-multus/multus-7r2r4"
Apr 20 07:50:05.442702 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.442189 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/04eada51-b824-41d7-8f99-553412d17053-cnibin\") pod \"multus-7r2r4\" (UID: \"04eada51-b824-41d7-8f99-553412d17053\") " pod="openshift-multus/multus-7r2r4"
Apr 20 07:50:05.442702 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.442271 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/1755049c-9fb4-42bf-8134-d41e0e7a4e97-konnectivity-ca\") pod \"konnectivity-agent-ggtd4\" (UID: \"1755049c-9fb4-42bf-8134-d41e0e7a4e97\") " pod="kube-system/konnectivity-agent-ggtd4"
Apr 20 07:50:05.442702 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.442327 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/04eada51-b824-41d7-8f99-553412d17053-cnibin\") pod \"multus-7r2r4\" (UID: \"04eada51-b824-41d7-8f99-553412d17053\") " pod="openshift-multus/multus-7r2r4"
Apr 20 07:50:05.442702 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.442351 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/9d731404-e3dc-45b5-a6c6-4d250a44a923-device-dir\") pod \"aws-ebs-csi-driver-node-24jx4\" (UID: \"9d731404-e3dc-45b5-a6c6-4d250a44a923\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-24jx4"
Apr 20 07:50:05.442702 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.442368 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-85f66\" (UniqueName: \"kubernetes.io/projected/9d731404-e3dc-45b5-a6c6-4d250a44a923-kube-api-access-85f66\") pod \"aws-ebs-csi-driver-node-24jx4\" (UID: \"9d731404-e3dc-45b5-a6c6-4d250a44a923\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-24jx4"
Apr 20 07:50:05.442702 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.442409 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/9d731404-e3dc-45b5-a6c6-4d250a44a923-device-dir\") pod \"aws-ebs-csi-driver-node-24jx4\" (UID: \"9d731404-e3dc-45b5-a6c6-4d250a44a923\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-24jx4"
Apr 20 07:50:05.442702 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.442485 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/04eada51-b824-41d7-8f99-553412d17053-host-run-netns\") pod \"multus-7r2r4\" (UID: \"04eada51-b824-41d7-8f99-553412d17053\") " pod="openshift-multus/multus-7r2r4"
Apr 20 07:50:05.442702 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.442541 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f3a24a42-45b7-4726-bc7c-32d9f9d61eaf-log-socket\") pod \"ovnkube-node-crn8d\" (UID: \"f3a24a42-45b7-4726-bc7c-32d9f9d61eaf\") " pod="openshift-ovn-kubernetes/ovnkube-node-crn8d"
Apr 20 07:50:05.442702 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.442584 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f3a24a42-45b7-4726-bc7c-32d9f9d61eaf-host-run-netns\") pod \"ovnkube-node-crn8d\" (UID: \"f3a24a42-45b7-4726-bc7c-32d9f9d61eaf\") " pod="openshift-ovn-kubernetes/ovnkube-node-crn8d"
Apr 20 07:50:05.442702 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.442631 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/04eada51-b824-41d7-8f99-553412d17053-multus-daemon-config\") pod \"multus-7r2r4\" (UID: \"04eada51-b824-41d7-8f99-553412d17053\") " pod="openshift-multus/multus-7r2r4"
Apr 20 07:50:05.442702 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.442671 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/04eada51-b824-41d7-8f99-553412d17053-cni-binary-copy\") pod \"multus-7r2r4\" (UID: \"04eada51-b824-41d7-8f99-553412d17053\") " pod="openshift-multus/multus-7r2r4"
Apr 20 07:50:05.442702 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.442678 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f3a24a42-45b7-4726-bc7c-32d9f9d61eaf-ovnkube-script-lib\") pod \"ovnkube-node-crn8d\" (UID: \"f3a24a42-45b7-4726-bc7c-32d9f9d61eaf\") " pod="openshift-ovn-kubernetes/ovnkube-node-crn8d"
Apr 20 07:50:05.442702 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.442681 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f3a24a42-45b7-4726-bc7c-32d9f9d61eaf-host-run-netns\") pod \"ovnkube-node-crn8d\" (UID: \"f3a24a42-45b7-4726-bc7c-32d9f9d61eaf\") " pod="openshift-ovn-kubernetes/ovnkube-node-crn8d"
Apr 20 07:50:05.442702 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.442709 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/04eada51-b824-41d7-8f99-553412d17053-multus-socket-dir-parent\") pod \"multus-7r2r4\" (UID: \"04eada51-b824-41d7-8f99-553412d17053\") " pod="openshift-multus/multus-7r2r4"
Apr 20 07:50:05.443507 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.442740 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qprwh\" (UniqueName: \"kubernetes.io/projected/04eada51-b824-41d7-8f99-553412d17053-kube-api-access-qprwh\") pod \"multus-7r2r4\" (UID: \"04eada51-b824-41d7-8f99-553412d17053\") " pod="openshift-multus/multus-7r2r4"
Apr 20 07:50:05.443507 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.442766 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f3a24a42-45b7-4726-bc7c-32d9f9d61eaf-systemd-units\") pod \"ovnkube-node-crn8d\" (UID: \"f3a24a42-45b7-4726-bc7c-32d9f9d61eaf\") " pod="openshift-ovn-kubernetes/ovnkube-node-crn8d"
Apr 20 07:50:05.443507 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.442792 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f3a24a42-45b7-4726-bc7c-32d9f9d61eaf-run-openvswitch\") pod \"ovnkube-node-crn8d\" (UID: \"f3a24a42-45b7-4726-bc7c-32d9f9d61eaf\") " pod="openshift-ovn-kubernetes/ovnkube-node-crn8d"
Apr 20 07:50:05.443507 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.442817 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/04eada51-b824-41d7-8f99-553412d17053-host-var-lib-cni-bin\") pod \"multus-7r2r4\" (UID: \"04eada51-b824-41d7-8f99-553412d17053\") " pod="openshift-multus/multus-7r2r4"
Apr 20 07:50:05.443507 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.442841 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/9d731404-e3dc-45b5-a6c6-4d250a44a923-etc-selinux\") pod \"aws-ebs-csi-driver-node-24jx4\" (UID: \"9d731404-e3dc-45b5-a6c6-4d250a44a923\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-24jx4"
Apr 20 07:50:05.443507 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.442879 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f3a24a42-45b7-4726-bc7c-32d9f9d61eaf-ovnkube-config\") pod \"ovnkube-node-crn8d\" (UID: \"f3a24a42-45b7-4726-bc7c-32d9f9d61eaf\") " pod="openshift-ovn-kubernetes/ovnkube-node-crn8d"
Apr 20 07:50:05.443507 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.442903 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/04eada51-b824-41d7-8f99-553412d17053-multus-cni-dir\") pod \"multus-7r2r4\" (UID: \"04eada51-b824-41d7-8f99-553412d17053\") " pod="openshift-multus/multus-7r2r4"
Apr 20 07:50:05.443507 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.442926 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/04eada51-b824-41d7-8f99-553412d17053-host-var-lib-cni-multus\") pod \"multus-7r2r4\" (UID: \"04eada51-b824-41d7-8f99-553412d17053\") " pod="openshift-multus/multus-7r2r4"
Apr 20 07:50:05.443507 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.442957 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/04eada51-b824-41d7-8f99-553412d17053-multus-conf-dir\") pod \"multus-7r2r4\" (UID: \"04eada51-b824-41d7-8f99-553412d17053\") " pod="openshift-multus/multus-7r2r4"
Apr 20 07:50:05.443507 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.442982 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/04eada51-b824-41d7-8f99-553412d17053-host-run-multus-certs\") pod \"multus-7r2r4\" (UID: \"04eada51-b824-41d7-8f99-553412d17053\") " pod="openshift-multus/multus-7r2r4"
Apr 20 07:50:05.443507 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.443008 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8nc7c\" (UniqueName: \"kubernetes.io/projected/ef4d76cc-35ef-46b3-83b9-ee5fe3bb7a06-kube-api-access-8nc7c\") pod \"network-check-target-sfl6t\" (UID: \"ef4d76cc-35ef-46b3-83b9-ee5fe3bb7a06\") " pod="openshift-network-diagnostics/network-check-target-sfl6t"
Apr 20 07:50:05.443507 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.443038 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f3a24a42-45b7-4726-bc7c-32d9f9d61eaf-run-systemd\") pod \"ovnkube-node-crn8d\" (UID: \"f3a24a42-45b7-4726-bc7c-32d9f9d61eaf\") " pod="openshift-ovn-kubernetes/ovnkube-node-crn8d"
Apr 20 07:50:05.443507 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.443061 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f3a24a42-45b7-4726-bc7c-32d9f9d61eaf-run-ovn\") pod \"ovnkube-node-crn8d\" (UID: \"f3a24a42-45b7-4726-bc7c-32d9f9d61eaf\") " pod="openshift-ovn-kubernetes/ovnkube-node-crn8d"
Apr 20 07:50:05.443507 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.443087 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f3a24a42-45b7-4726-bc7c-32d9f9d61eaf-host-kubelet\") pod \"ovnkube-node-crn8d\" (UID: \"f3a24a42-45b7-4726-bc7c-32d9f9d61eaf\") " pod="openshift-ovn-kubernetes/ovnkube-node-crn8d"
Apr 20 07:50:05.443507 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.443112 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f3a24a42-45b7-4726-bc7c-32d9f9d61eaf-node-log\") pod \"ovnkube-node-crn8d\" (UID: \"f3a24a42-45b7-4726-bc7c-32d9f9d61eaf\") " pod="openshift-ovn-kubernetes/ovnkube-node-crn8d"
Apr 20 07:50:05.443507 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.443132 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/04eada51-b824-41d7-8f99-553412d17053-multus-daemon-config\") pod \"multus-7r2r4\" (UID: \"04eada51-b824-41d7-8f99-553412d17053\") " pod="openshift-multus/multus-7r2r4"
Apr 20 07:50:05.443507 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.443139 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/04eada51-b824-41d7-8f99-553412d17053-host-var-lib-kubelet\") pod \"multus-7r2r4\" (UID: \"04eada51-b824-41d7-8f99-553412d17053\") " pod="openshift-multus/multus-7r2r4"
Apr 20 07:50:05.443507 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.443176
2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f3a24a42-45b7-4726-bc7c-32d9f9d61eaf-host-slash\") pod \"ovnkube-node-crn8d\" (UID: \"f3a24a42-45b7-4726-bc7c-32d9f9d61eaf\") " pod="openshift-ovn-kubernetes/ovnkube-node-crn8d" Apr 20 07:50:05.444283 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.443194 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/04eada51-b824-41d7-8f99-553412d17053-cni-binary-copy\") pod \"multus-7r2r4\" (UID: \"04eada51-b824-41d7-8f99-553412d17053\") " pod="openshift-multus/multus-7r2r4" Apr 20 07:50:05.444283 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.443243 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/04eada51-b824-41d7-8f99-553412d17053-host-var-lib-kubelet\") pod \"multus-7r2r4\" (UID: \"04eada51-b824-41d7-8f99-553412d17053\") " pod="openshift-multus/multus-7r2r4" Apr 20 07:50:05.444283 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.443247 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/04eada51-b824-41d7-8f99-553412d17053-host-var-lib-cni-bin\") pod \"multus-7r2r4\" (UID: \"04eada51-b824-41d7-8f99-553412d17053\") " pod="openshift-multus/multus-7r2r4" Apr 20 07:50:05.444283 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.443265 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/9d731404-e3dc-45b5-a6c6-4d250a44a923-etc-selinux\") pod \"aws-ebs-csi-driver-node-24jx4\" (UID: \"9d731404-e3dc-45b5-a6c6-4d250a44a923\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-24jx4" Apr 20 07:50:05.444283 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.443280 2575 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f3a24a42-45b7-4726-bc7c-32d9f9d61eaf-run-ovn\") pod \"ovnkube-node-crn8d\" (UID: \"f3a24a42-45b7-4726-bc7c-32d9f9d61eaf\") " pod="openshift-ovn-kubernetes/ovnkube-node-crn8d" Apr 20 07:50:05.444283 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.443278 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/04eada51-b824-41d7-8f99-553412d17053-host-var-lib-cni-multus\") pod \"multus-7r2r4\" (UID: \"04eada51-b824-41d7-8f99-553412d17053\") " pod="openshift-multus/multus-7r2r4" Apr 20 07:50:05.444283 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.443301 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f3a24a42-45b7-4726-bc7c-32d9f9d61eaf-host-kubelet\") pod \"ovnkube-node-crn8d\" (UID: \"f3a24a42-45b7-4726-bc7c-32d9f9d61eaf\") " pod="openshift-ovn-kubernetes/ovnkube-node-crn8d" Apr 20 07:50:05.444283 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.443306 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f3a24a42-45b7-4726-bc7c-32d9f9d61eaf-host-run-ovn-kubernetes\") pod \"ovnkube-node-crn8d\" (UID: \"f3a24a42-45b7-4726-bc7c-32d9f9d61eaf\") " pod="openshift-ovn-kubernetes/ovnkube-node-crn8d" Apr 20 07:50:05.444283 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.443322 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/04eada51-b824-41d7-8f99-553412d17053-multus-socket-dir-parent\") pod \"multus-7r2r4\" (UID: \"04eada51-b824-41d7-8f99-553412d17053\") " pod="openshift-multus/multus-7r2r4" Apr 20 07:50:05.444283 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.443323 2575 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f3a24a42-45b7-4726-bc7c-32d9f9d61eaf-ovn-node-metrics-cert\") pod \"ovnkube-node-crn8d\" (UID: \"f3a24a42-45b7-4726-bc7c-32d9f9d61eaf\") " pod="openshift-ovn-kubernetes/ovnkube-node-crn8d" Apr 20 07:50:05.444283 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.443332 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/04eada51-b824-41d7-8f99-553412d17053-host-run-multus-certs\") pod \"multus-7r2r4\" (UID: \"04eada51-b824-41d7-8f99-553412d17053\") " pod="openshift-multus/multus-7r2r4" Apr 20 07:50:05.444283 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.443481 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/04eada51-b824-41d7-8f99-553412d17053-hostroot\") pod \"multus-7r2r4\" (UID: \"04eada51-b824-41d7-8f99-553412d17053\") " pod="openshift-multus/multus-7r2r4" Apr 20 07:50:05.444283 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.443509 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/1755049c-9fb4-42bf-8134-d41e0e7a4e97-agent-certs\") pod \"konnectivity-agent-ggtd4\" (UID: \"1755049c-9fb4-42bf-8134-d41e0e7a4e97\") " pod="kube-system/konnectivity-agent-ggtd4" Apr 20 07:50:05.444283 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.443536 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j57md\" (UniqueName: \"kubernetes.io/projected/f3a24a42-45b7-4726-bc7c-32d9f9d61eaf-kube-api-access-j57md\") pod \"ovnkube-node-crn8d\" (UID: \"f3a24a42-45b7-4726-bc7c-32d9f9d61eaf\") " pod="openshift-ovn-kubernetes/ovnkube-node-crn8d" Apr 20 07:50:05.444283 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.443564 2575 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f3a24a42-45b7-4726-bc7c-32d9f9d61eaf-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-crn8d\" (UID: \"f3a24a42-45b7-4726-bc7c-32d9f9d61eaf\") " pod="openshift-ovn-kubernetes/ovnkube-node-crn8d" Apr 20 07:50:05.444283 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.443584 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f3a24a42-45b7-4726-bc7c-32d9f9d61eaf-node-log\") pod \"ovnkube-node-crn8d\" (UID: \"f3a24a42-45b7-4726-bc7c-32d9f9d61eaf\") " pod="openshift-ovn-kubernetes/ovnkube-node-crn8d" Apr 20 07:50:05.444283 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.443592 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/04eada51-b824-41d7-8f99-553412d17053-os-release\") pod \"multus-7r2r4\" (UID: \"04eada51-b824-41d7-8f99-553412d17053\") " pod="openshift-multus/multus-7r2r4" Apr 20 07:50:05.444283 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.443664 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/04eada51-b824-41d7-8f99-553412d17053-hostroot\") pod \"multus-7r2r4\" (UID: \"04eada51-b824-41d7-8f99-553412d17053\") " pod="openshift-multus/multus-7r2r4" Apr 20 07:50:05.444939 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.443672 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/04eada51-b824-41d7-8f99-553412d17053-multus-conf-dir\") pod \"multus-7r2r4\" (UID: \"04eada51-b824-41d7-8f99-553412d17053\") " pod="openshift-multus/multus-7r2r4" Apr 20 07:50:05.444939 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.443700 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f3a24a42-45b7-4726-bc7c-32d9f9d61eaf-systemd-units\") pod \"ovnkube-node-crn8d\" (UID: \"f3a24a42-45b7-4726-bc7c-32d9f9d61eaf\") " pod="openshift-ovn-kubernetes/ovnkube-node-crn8d" Apr 20 07:50:05.444939 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.443719 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f3a24a42-45b7-4726-bc7c-32d9f9d61eaf-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-crn8d\" (UID: \"f3a24a42-45b7-4726-bc7c-32d9f9d61eaf\") " pod="openshift-ovn-kubernetes/ovnkube-node-crn8d" Apr 20 07:50:05.444939 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.443754 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f3a24a42-45b7-4726-bc7c-32d9f9d61eaf-host-slash\") pod \"ovnkube-node-crn8d\" (UID: \"f3a24a42-45b7-4726-bc7c-32d9f9d61eaf\") " pod="openshift-ovn-kubernetes/ovnkube-node-crn8d" Apr 20 07:50:05.444939 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.443807 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/04eada51-b824-41d7-8f99-553412d17053-os-release\") pod \"multus-7r2r4\" (UID: \"04eada51-b824-41d7-8f99-553412d17053\") " pod="openshift-multus/multus-7r2r4" Apr 20 07:50:05.444939 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.443844 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f3a24a42-45b7-4726-bc7c-32d9f9d61eaf-host-run-ovn-kubernetes\") pod \"ovnkube-node-crn8d\" (UID: \"f3a24a42-45b7-4726-bc7c-32d9f9d61eaf\") " pod="openshift-ovn-kubernetes/ovnkube-node-crn8d" Apr 20 07:50:05.444939 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.443876 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f3a24a42-45b7-4726-bc7c-32d9f9d61eaf-run-systemd\") pod \"ovnkube-node-crn8d\" (UID: \"f3a24a42-45b7-4726-bc7c-32d9f9d61eaf\") " pod="openshift-ovn-kubernetes/ovnkube-node-crn8d" Apr 20 07:50:05.444939 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.443901 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f3a24a42-45b7-4726-bc7c-32d9f9d61eaf-run-openvswitch\") pod \"ovnkube-node-crn8d\" (UID: \"f3a24a42-45b7-4726-bc7c-32d9f9d61eaf\") " pod="openshift-ovn-kubernetes/ovnkube-node-crn8d" Apr 20 07:50:05.444939 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.443943 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/04eada51-b824-41d7-8f99-553412d17053-multus-cni-dir\") pod \"multus-7r2r4\" (UID: \"04eada51-b824-41d7-8f99-553412d17053\") " pod="openshift-multus/multus-7r2r4" Apr 20 07:50:05.444939 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.444144 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f3a24a42-45b7-4726-bc7c-32d9f9d61eaf-ovnkube-config\") pod \"ovnkube-node-crn8d\" (UID: \"f3a24a42-45b7-4726-bc7c-32d9f9d61eaf\") " pod="openshift-ovn-kubernetes/ovnkube-node-crn8d" Apr 20 07:50:05.445937 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.445914 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f3a24a42-45b7-4726-bc7c-32d9f9d61eaf-ovn-node-metrics-cert\") pod \"ovnkube-node-crn8d\" (UID: \"f3a24a42-45b7-4726-bc7c-32d9f9d61eaf\") " pod="openshift-ovn-kubernetes/ovnkube-node-crn8d" Apr 20 07:50:05.446280 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.446259 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: 
\"kubernetes.io/secret/1755049c-9fb4-42bf-8134-d41e0e7a4e97-agent-certs\") pod \"konnectivity-agent-ggtd4\" (UID: \"1755049c-9fb4-42bf-8134-d41e0e7a4e97\") " pod="kube-system/konnectivity-agent-ggtd4" Apr 20 07:50:05.448379 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:05.448361 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 07:50:05.448379 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:05.448382 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 07:50:05.448548 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:05.448395 2575 projected.go:194] Error preparing data for projected volume kube-api-access-8nc7c for pod openshift-network-diagnostics/network-check-target-sfl6t: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 07:50:05.448548 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:05.448457 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ef4d76cc-35ef-46b3-83b9-ee5fe3bb7a06-kube-api-access-8nc7c podName:ef4d76cc-35ef-46b3-83b9-ee5fe3bb7a06 nodeName:}" failed. No retries permitted until 2026-04-20 07:50:05.948438126 +0000 UTC m=+3.121765552 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-8nc7c" (UniqueName: "kubernetes.io/projected/ef4d76cc-35ef-46b3-83b9-ee5fe3bb7a06-kube-api-access-8nc7c") pod "network-check-target-sfl6t" (UID: "ef4d76cc-35ef-46b3-83b9-ee5fe3bb7a06") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 07:50:05.450407 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.450355 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-85f66\" (UniqueName: \"kubernetes.io/projected/9d731404-e3dc-45b5-a6c6-4d250a44a923-kube-api-access-85f66\") pod \"aws-ebs-csi-driver-node-24jx4\" (UID: \"9d731404-e3dc-45b5-a6c6-4d250a44a923\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-24jx4" Apr 20 07:50:05.450831 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.450811 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qprwh\" (UniqueName: \"kubernetes.io/projected/04eada51-b824-41d7-8f99-553412d17053-kube-api-access-qprwh\") pod \"multus-7r2r4\" (UID: \"04eada51-b824-41d7-8f99-553412d17053\") " pod="openshift-multus/multus-7r2r4" Apr 20 07:50:05.451071 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.451055 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j57md\" (UniqueName: \"kubernetes.io/projected/f3a24a42-45b7-4726-bc7c-32d9f9d61eaf-kube-api-access-j57md\") pod \"ovnkube-node-crn8d\" (UID: \"f3a24a42-45b7-4726-bc7c-32d9f9d61eaf\") " pod="openshift-ovn-kubernetes/ovnkube-node-crn8d" Apr 20 07:50:05.539156 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.539125 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-gt9w2" Apr 20 07:50:05.547020 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.546990 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-z54dw" Apr 20 07:50:05.556706 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.556681 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-n9944" Apr 20 07:50:05.561396 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.561373 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-hzzk5" Apr 20 07:50:05.570029 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.570004 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-459v4" Apr 20 07:50:05.579779 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.579756 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-crn8d" Apr 20 07:50:05.588479 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.588453 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-ggtd4" Apr 20 07:50:05.598227 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.598202 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-24jx4" Apr 20 07:50:05.604961 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.604937 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-7r2r4" Apr 20 07:50:05.847101 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:05.847020 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/46b8deca-b33b-45e1-9131-b47fde192a78-metrics-certs\") pod \"network-metrics-daemon-7qpdh\" (UID: \"46b8deca-b33b-45e1-9131-b47fde192a78\") " pod="openshift-multus/network-metrics-daemon-7qpdh" Apr 20 07:50:05.847260 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:05.847145 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 07:50:05.847260 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:05.847215 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/46b8deca-b33b-45e1-9131-b47fde192a78-metrics-certs podName:46b8deca-b33b-45e1-9131-b47fde192a78 nodeName:}" failed. No retries permitted until 2026-04-20 07:50:06.84719841 +0000 UTC m=+4.020525817 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/46b8deca-b33b-45e1-9131-b47fde192a78-metrics-certs") pod "network-metrics-daemon-7qpdh" (UID: "46b8deca-b33b-45e1-9131-b47fde192a78") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 07:50:06.048239 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:06.048204 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8nc7c\" (UniqueName: \"kubernetes.io/projected/ef4d76cc-35ef-46b3-83b9-ee5fe3bb7a06-kube-api-access-8nc7c\") pod \"network-check-target-sfl6t\" (UID: \"ef4d76cc-35ef-46b3-83b9-ee5fe3bb7a06\") " pod="openshift-network-diagnostics/network-check-target-sfl6t" Apr 20 07:50:06.048385 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:06.048332 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 07:50:06.048385 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:06.048347 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 07:50:06.048385 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:06.048359 2575 projected.go:194] Error preparing data for projected volume kube-api-access-8nc7c for pod openshift-network-diagnostics/network-check-target-sfl6t: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 07:50:06.048523 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:06.048410 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ef4d76cc-35ef-46b3-83b9-ee5fe3bb7a06-kube-api-access-8nc7c podName:ef4d76cc-35ef-46b3-83b9-ee5fe3bb7a06 nodeName:}" failed. 
No retries permitted until 2026-04-20 07:50:07.048392912 +0000 UTC m=+4.221720319 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-8nc7c" (UniqueName: "kubernetes.io/projected/ef4d76cc-35ef-46b3-83b9-ee5fe3bb7a06-kube-api-access-8nc7c") pod "network-check-target-sfl6t" (UID: "ef4d76cc-35ef-46b3-83b9-ee5fe3bb7a06") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 07:50:06.215639 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:06.215397 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d731404_e3dc_45b5_a6c6_4d250a44a923.slice/crio-68d359adbd004b82f833407742bd143e952923eb94e9374ebcb39dfe26321f5a WatchSource:0}: Error finding container 68d359adbd004b82f833407742bd143e952923eb94e9374ebcb39dfe26321f5a: Status 404 returned error can't find the container with id 68d359adbd004b82f833407742bd143e952923eb94e9374ebcb39dfe26321f5a Apr 20 07:50:06.217125 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:06.217097 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf3a24a42_45b7_4726_bc7c_32d9f9d61eaf.slice/crio-07161ad075c08f418d6283fa5e8c949526e7f9b39707693a5ba0cfbfc47f8c71 WatchSource:0}: Error finding container 07161ad075c08f418d6283fa5e8c949526e7f9b39707693a5ba0cfbfc47f8c71: Status 404 returned error can't find the container with id 07161ad075c08f418d6283fa5e8c949526e7f9b39707693a5ba0cfbfc47f8c71 Apr 20 07:50:06.217714 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:06.217659 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1755049c_9fb4_42bf_8134_d41e0e7a4e97.slice/crio-a315db38209af93849f2c2f69817c69a3bcfef75f808463bd09e1cc576a3540a WatchSource:0}: Error finding container 
a315db38209af93849f2c2f69817c69a3bcfef75f808463bd09e1cc576a3540a: Status 404 returned error can't find the container with id a315db38209af93849f2c2f69817c69a3bcfef75f808463bd09e1cc576a3540a Apr 20 07:50:06.220784 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:06.220712 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb7ed9bc9_8f0a_40e4_bd11_25b4f1c3cd39.slice/crio-994baecf5fba60f40142a2c9320925b04082c11267a172779b78438b396c6899 WatchSource:0}: Error finding container 994baecf5fba60f40142a2c9320925b04082c11267a172779b78438b396c6899: Status 404 returned error can't find the container with id 994baecf5fba60f40142a2c9320925b04082c11267a172779b78438b396c6899 Apr 20 07:50:06.268863 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:06.268826 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-19 07:45:04 +0000 UTC" deadline="2027-10-14 21:58:07.35420836 +0000 UTC" Apr 20 07:50:06.268863 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:06.268862 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13022h8m1.085350052s" Apr 20 07:50:06.294390 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:06.294351 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-24jx4" event={"ID":"9d731404-e3dc-45b5-a6c6-4d250a44a923","Type":"ContainerStarted","Data":"68d359adbd004b82f833407742bd143e952923eb94e9374ebcb39dfe26321f5a"} Apr 20 07:50:06.295289 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:06.295265 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-z54dw" event={"ID":"14afe6ef-f671-4736-ba3f-ac6236c30291","Type":"ContainerStarted","Data":"8502d922521412393c4ccafdf7e2efce7a4f55d173e1ac4c06abb8944720df0c"} Apr 20 07:50:06.296215 ip-10-0-129-24 kubenswrapper[2575]: 
I0420 07:50:06.296192 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-7r2r4" event={"ID":"04eada51-b824-41d7-8f99-553412d17053","Type":"ContainerStarted","Data":"469c6dae51416fa8a13f251dde3256bb28282ec637447c8f5c9a9694c1389f50"} Apr 20 07:50:06.297075 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:06.297045 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-gt9w2" event={"ID":"c1bad8b1-05dc-4cf2-a2b4-6354840ca7a0","Type":"ContainerStarted","Data":"410a57b05ed32a152745e6295c596c8a6d8abcbbb636f95cf318a1ecf2a29aeb"} Apr 20 07:50:06.298096 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:06.297896 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-459v4" event={"ID":"f8898642-a099-48c6-ba9f-0a5099d78d5c","Type":"ContainerStarted","Data":"e41bbba5735ab7b8948612acb3ed2756489f1bd3586a34e52fd2503f9be74c6c"} Apr 20 07:50:06.298900 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:06.298876 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-n9944" event={"ID":"b7ed9bc9-8f0a-40e4-bd11-25b4f1c3cd39","Type":"ContainerStarted","Data":"994baecf5fba60f40142a2c9320925b04082c11267a172779b78438b396c6899"} Apr 20 07:50:06.299887 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:06.299867 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-ggtd4" event={"ID":"1755049c-9fb4-42bf-8134-d41e0e7a4e97","Type":"ContainerStarted","Data":"a315db38209af93849f2c2f69817c69a3bcfef75f808463bd09e1cc576a3540a"} Apr 20 07:50:06.300760 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:06.300735 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hzzk5" event={"ID":"ba7d0588-c3eb-4849-ae3b-630be7fcc621","Type":"ContainerStarted","Data":"c2c2e4f26637e2c1cc2bb75049ea0c1dc0975ef1c467f7319c76ed415733d570"} Apr 20 07:50:06.301626 
ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:06.301593 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-crn8d" event={"ID":"f3a24a42-45b7-4726-bc7c-32d9f9d61eaf","Type":"ContainerStarted","Data":"07161ad075c08f418d6283fa5e8c949526e7f9b39707693a5ba0cfbfc47f8c71"} Apr 20 07:50:06.856178 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:06.855560 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/46b8deca-b33b-45e1-9131-b47fde192a78-metrics-certs\") pod \"network-metrics-daemon-7qpdh\" (UID: \"46b8deca-b33b-45e1-9131-b47fde192a78\") " pod="openshift-multus/network-metrics-daemon-7qpdh" Apr 20 07:50:06.856178 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:06.855774 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 07:50:06.856178 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:06.855835 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/46b8deca-b33b-45e1-9131-b47fde192a78-metrics-certs podName:46b8deca-b33b-45e1-9131-b47fde192a78 nodeName:}" failed. No retries permitted until 2026-04-20 07:50:08.855814673 +0000 UTC m=+6.029142085 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/46b8deca-b33b-45e1-9131-b47fde192a78-metrics-certs") pod "network-metrics-daemon-7qpdh" (UID: "46b8deca-b33b-45e1-9131-b47fde192a78") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 07:50:07.058177 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:07.058074 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8nc7c\" (UniqueName: \"kubernetes.io/projected/ef4d76cc-35ef-46b3-83b9-ee5fe3bb7a06-kube-api-access-8nc7c\") pod \"network-check-target-sfl6t\" (UID: \"ef4d76cc-35ef-46b3-83b9-ee5fe3bb7a06\") " pod="openshift-network-diagnostics/network-check-target-sfl6t" Apr 20 07:50:07.058336 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:07.058262 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 07:50:07.058336 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:07.058281 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 07:50:07.058336 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:07.058294 2575 projected.go:194] Error preparing data for projected volume kube-api-access-8nc7c for pod openshift-network-diagnostics/network-check-target-sfl6t: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 07:50:07.058506 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:07.058352 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ef4d76cc-35ef-46b3-83b9-ee5fe3bb7a06-kube-api-access-8nc7c podName:ef4d76cc-35ef-46b3-83b9-ee5fe3bb7a06 nodeName:}" failed. 
No retries permitted until 2026-04-20 07:50:09.058334544 +0000 UTC m=+6.231661957 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-8nc7c" (UniqueName: "kubernetes.io/projected/ef4d76cc-35ef-46b3-83b9-ee5fe3bb7a06-kube-api-access-8nc7c") pod "network-check-target-sfl6t" (UID: "ef4d76cc-35ef-46b3-83b9-ee5fe3bb7a06") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 07:50:07.287249 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:07.286764 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sfl6t" Apr 20 07:50:07.287249 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:07.286893 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sfl6t" podUID="ef4d76cc-35ef-46b3-83b9-ee5fe3bb7a06" Apr 20 07:50:07.287249 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:07.286943 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7qpdh" Apr 20 07:50:07.287249 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:07.287075 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7qpdh" podUID="46b8deca-b33b-45e1-9131-b47fde192a78" Apr 20 07:50:07.316918 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:07.316825 2575 generic.go:358] "Generic (PLEG): container finished" podID="2a8554c547a781f4277e2efb4f0aafc4" containerID="156a7cee42f6d5a21ee7146bf9e0bc99c9b2da3aee31d8cfd0968d3cd4f850b9" exitCode=0 Apr 20 07:50:07.317416 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:07.317387 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-24.ec2.internal" event={"ID":"2a8554c547a781f4277e2efb4f0aafc4","Type":"ContainerDied","Data":"156a7cee42f6d5a21ee7146bf9e0bc99c9b2da3aee31d8cfd0968d3cd4f850b9"} Apr 20 07:50:07.337483 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:07.337449 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-24.ec2.internal" event={"ID":"5018c020cf992df6097e0aee42f16bf3","Type":"ContainerStarted","Data":"c84ff16eb4a344f8724dd5f0abd05ed0a2935f308acf3c2ebd5c3410c41bc247"} Apr 20 07:50:07.465915 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:07.464033 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-24.ec2.internal" podStartSLOduration=3.464010694 podStartE2EDuration="3.464010694s" podCreationTimestamp="2026-04-20 07:50:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 07:50:07.351634097 +0000 UTC m=+4.524961523" watchObservedRunningTime="2026-04-20 07:50:07.464010694 +0000 UTC m=+4.637338125" Apr 20 07:50:07.465915 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:07.465085 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-nv6pl"] Apr 20 07:50:07.468674 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:07.468211 2575 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-nv6pl" Apr 20 07:50:07.468674 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:07.468287 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-nv6pl" podUID="036e621d-2979-4552-ac5c-6fab5743df3a" Apr 20 07:50:07.561487 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:07.561439 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/036e621d-2979-4552-ac5c-6fab5743df3a-kubelet-config\") pod \"global-pull-secret-syncer-nv6pl\" (UID: \"036e621d-2979-4552-ac5c-6fab5743df3a\") " pod="kube-system/global-pull-secret-syncer-nv6pl" Apr 20 07:50:07.561487 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:07.561491 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/036e621d-2979-4552-ac5c-6fab5743df3a-dbus\") pod \"global-pull-secret-syncer-nv6pl\" (UID: \"036e621d-2979-4552-ac5c-6fab5743df3a\") " pod="kube-system/global-pull-secret-syncer-nv6pl" Apr 20 07:50:07.561738 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:07.561551 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/036e621d-2979-4552-ac5c-6fab5743df3a-original-pull-secret\") pod \"global-pull-secret-syncer-nv6pl\" (UID: \"036e621d-2979-4552-ac5c-6fab5743df3a\") " pod="kube-system/global-pull-secret-syncer-nv6pl" Apr 20 07:50:07.662137 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:07.662102 2575 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/036e621d-2979-4552-ac5c-6fab5743df3a-original-pull-secret\") pod \"global-pull-secret-syncer-nv6pl\" (UID: \"036e621d-2979-4552-ac5c-6fab5743df3a\") " pod="kube-system/global-pull-secret-syncer-nv6pl" Apr 20 07:50:07.662280 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:07.662194 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/036e621d-2979-4552-ac5c-6fab5743df3a-kubelet-config\") pod \"global-pull-secret-syncer-nv6pl\" (UID: \"036e621d-2979-4552-ac5c-6fab5743df3a\") " pod="kube-system/global-pull-secret-syncer-nv6pl" Apr 20 07:50:07.662280 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:07.662221 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/036e621d-2979-4552-ac5c-6fab5743df3a-dbus\") pod \"global-pull-secret-syncer-nv6pl\" (UID: \"036e621d-2979-4552-ac5c-6fab5743df3a\") " pod="kube-system/global-pull-secret-syncer-nv6pl" Apr 20 07:50:07.662440 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:07.662415 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/036e621d-2979-4552-ac5c-6fab5743df3a-dbus\") pod \"global-pull-secret-syncer-nv6pl\" (UID: \"036e621d-2979-4552-ac5c-6fab5743df3a\") " pod="kube-system/global-pull-secret-syncer-nv6pl" Apr 20 07:50:07.662549 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:07.662531 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 07:50:07.662604 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:07.662591 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/036e621d-2979-4552-ac5c-6fab5743df3a-original-pull-secret podName:036e621d-2979-4552-ac5c-6fab5743df3a nodeName:}" failed. 
No retries permitted until 2026-04-20 07:50:08.162573004 +0000 UTC m=+5.335900417 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/036e621d-2979-4552-ac5c-6fab5743df3a-original-pull-secret") pod "global-pull-secret-syncer-nv6pl" (UID: "036e621d-2979-4552-ac5c-6fab5743df3a") : object "kube-system"/"original-pull-secret" not registered Apr 20 07:50:07.662952 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:07.662858 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/036e621d-2979-4552-ac5c-6fab5743df3a-kubelet-config\") pod \"global-pull-secret-syncer-nv6pl\" (UID: \"036e621d-2979-4552-ac5c-6fab5743df3a\") " pod="kube-system/global-pull-secret-syncer-nv6pl" Apr 20 07:50:08.165898 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:08.165860 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/036e621d-2979-4552-ac5c-6fab5743df3a-original-pull-secret\") pod \"global-pull-secret-syncer-nv6pl\" (UID: \"036e621d-2979-4552-ac5c-6fab5743df3a\") " pod="kube-system/global-pull-secret-syncer-nv6pl" Apr 20 07:50:08.166097 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:08.166076 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 07:50:08.166171 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:08.166142 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/036e621d-2979-4552-ac5c-6fab5743df3a-original-pull-secret podName:036e621d-2979-4552-ac5c-6fab5743df3a nodeName:}" failed. No retries permitted until 2026-04-20 07:50:09.166124767 +0000 UTC m=+6.339452183 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/036e621d-2979-4552-ac5c-6fab5743df3a-original-pull-secret") pod "global-pull-secret-syncer-nv6pl" (UID: "036e621d-2979-4552-ac5c-6fab5743df3a") : object "kube-system"/"original-pull-secret" not registered Apr 20 07:50:08.382019 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:08.381940 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-24.ec2.internal" event={"ID":"2a8554c547a781f4277e2efb4f0aafc4","Type":"ContainerStarted","Data":"6d0bd1a4813f9eb72eb68d5d627cf1f50d5d434fb12796ebac554b5f07ec278d"} Apr 20 07:50:08.872812 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:08.872774 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/46b8deca-b33b-45e1-9131-b47fde192a78-metrics-certs\") pod \"network-metrics-daemon-7qpdh\" (UID: \"46b8deca-b33b-45e1-9131-b47fde192a78\") " pod="openshift-multus/network-metrics-daemon-7qpdh" Apr 20 07:50:08.873035 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:08.873008 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 07:50:08.873109 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:08.873076 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/46b8deca-b33b-45e1-9131-b47fde192a78-metrics-certs podName:46b8deca-b33b-45e1-9131-b47fde192a78 nodeName:}" failed. No retries permitted until 2026-04-20 07:50:12.873057578 +0000 UTC m=+10.046384993 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/46b8deca-b33b-45e1-9131-b47fde192a78-metrics-certs") pod "network-metrics-daemon-7qpdh" (UID: "46b8deca-b33b-45e1-9131-b47fde192a78") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 07:50:09.074607 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:09.073993 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8nc7c\" (UniqueName: \"kubernetes.io/projected/ef4d76cc-35ef-46b3-83b9-ee5fe3bb7a06-kube-api-access-8nc7c\") pod \"network-check-target-sfl6t\" (UID: \"ef4d76cc-35ef-46b3-83b9-ee5fe3bb7a06\") " pod="openshift-network-diagnostics/network-check-target-sfl6t" Apr 20 07:50:09.074607 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:09.074144 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 07:50:09.074607 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:09.074163 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 07:50:09.074607 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:09.074176 2575 projected.go:194] Error preparing data for projected volume kube-api-access-8nc7c for pod openshift-network-diagnostics/network-check-target-sfl6t: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 07:50:09.074607 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:09.074234 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ef4d76cc-35ef-46b3-83b9-ee5fe3bb7a06-kube-api-access-8nc7c podName:ef4d76cc-35ef-46b3-83b9-ee5fe3bb7a06 nodeName:}" failed. 
No retries permitted until 2026-04-20 07:50:13.074215172 +0000 UTC m=+10.247542582 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-8nc7c" (UniqueName: "kubernetes.io/projected/ef4d76cc-35ef-46b3-83b9-ee5fe3bb7a06-kube-api-access-8nc7c") pod "network-check-target-sfl6t" (UID: "ef4d76cc-35ef-46b3-83b9-ee5fe3bb7a06") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 07:50:09.175296 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:09.174594 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/036e621d-2979-4552-ac5c-6fab5743df3a-original-pull-secret\") pod \"global-pull-secret-syncer-nv6pl\" (UID: \"036e621d-2979-4552-ac5c-6fab5743df3a\") " pod="kube-system/global-pull-secret-syncer-nv6pl" Apr 20 07:50:09.175296 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:09.174809 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 07:50:09.175296 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:09.174885 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/036e621d-2979-4552-ac5c-6fab5743df3a-original-pull-secret podName:036e621d-2979-4552-ac5c-6fab5743df3a nodeName:}" failed. No retries permitted until 2026-04-20 07:50:11.174863862 +0000 UTC m=+8.348191278 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/036e621d-2979-4552-ac5c-6fab5743df3a-original-pull-secret") pod "global-pull-secret-syncer-nv6pl" (UID: "036e621d-2979-4552-ac5c-6fab5743df3a") : object "kube-system"/"original-pull-secret" not registered Apr 20 07:50:09.286532 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:09.286497 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sfl6t" Apr 20 07:50:09.286707 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:09.286665 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7qpdh" Apr 20 07:50:09.286707 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:09.286693 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sfl6t" podUID="ef4d76cc-35ef-46b3-83b9-ee5fe3bb7a06" Apr 20 07:50:09.286844 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:09.286787 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7qpdh" podUID="46b8deca-b33b-45e1-9131-b47fde192a78" Apr 20 07:50:09.286844 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:09.286815 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-nv6pl" Apr 20 07:50:09.287005 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:09.286879 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-nv6pl" podUID="036e621d-2979-4552-ac5c-6fab5743df3a" Apr 20 07:50:11.194359 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:11.194267 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/036e621d-2979-4552-ac5c-6fab5743df3a-original-pull-secret\") pod \"global-pull-secret-syncer-nv6pl\" (UID: \"036e621d-2979-4552-ac5c-6fab5743df3a\") " pod="kube-system/global-pull-secret-syncer-nv6pl" Apr 20 07:50:11.194859 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:11.194450 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 07:50:11.194859 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:11.194530 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/036e621d-2979-4552-ac5c-6fab5743df3a-original-pull-secret podName:036e621d-2979-4552-ac5c-6fab5743df3a nodeName:}" failed. No retries permitted until 2026-04-20 07:50:15.194510312 +0000 UTC m=+12.367837722 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/036e621d-2979-4552-ac5c-6fab5743df3a-original-pull-secret") pod "global-pull-secret-syncer-nv6pl" (UID: "036e621d-2979-4552-ac5c-6fab5743df3a") : object "kube-system"/"original-pull-secret" not registered Apr 20 07:50:11.286708 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:11.286298 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sfl6t" Apr 20 07:50:11.286708 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:11.286303 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-nv6pl" Apr 20 07:50:11.286708 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:11.286423 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sfl6t" podUID="ef4d76cc-35ef-46b3-83b9-ee5fe3bb7a06" Apr 20 07:50:11.286708 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:11.286488 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-nv6pl" podUID="036e621d-2979-4552-ac5c-6fab5743df3a" Apr 20 07:50:11.286708 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:11.286533 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7qpdh" Apr 20 07:50:11.286708 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:11.286629 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7qpdh" podUID="46b8deca-b33b-45e1-9131-b47fde192a78" Apr 20 07:50:12.909032 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:12.908417 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/46b8deca-b33b-45e1-9131-b47fde192a78-metrics-certs\") pod \"network-metrics-daemon-7qpdh\" (UID: \"46b8deca-b33b-45e1-9131-b47fde192a78\") " pod="openshift-multus/network-metrics-daemon-7qpdh" Apr 20 07:50:12.909032 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:12.908585 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 07:50:12.909032 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:12.908681 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/46b8deca-b33b-45e1-9131-b47fde192a78-metrics-certs podName:46b8deca-b33b-45e1-9131-b47fde192a78 nodeName:}" failed. No retries permitted until 2026-04-20 07:50:20.90865939 +0000 UTC m=+18.081986802 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/46b8deca-b33b-45e1-9131-b47fde192a78-metrics-certs") pod "network-metrics-daemon-7qpdh" (UID: "46b8deca-b33b-45e1-9131-b47fde192a78") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 07:50:13.111014 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:13.110977 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8nc7c\" (UniqueName: \"kubernetes.io/projected/ef4d76cc-35ef-46b3-83b9-ee5fe3bb7a06-kube-api-access-8nc7c\") pod \"network-check-target-sfl6t\" (UID: \"ef4d76cc-35ef-46b3-83b9-ee5fe3bb7a06\") " pod="openshift-network-diagnostics/network-check-target-sfl6t" Apr 20 07:50:13.111200 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:13.111159 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 07:50:13.111200 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:13.111183 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 07:50:13.111200 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:13.111198 2575 projected.go:194] Error preparing data for projected volume kube-api-access-8nc7c for pod openshift-network-diagnostics/network-check-target-sfl6t: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 07:50:13.111348 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:13.111266 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ef4d76cc-35ef-46b3-83b9-ee5fe3bb7a06-kube-api-access-8nc7c podName:ef4d76cc-35ef-46b3-83b9-ee5fe3bb7a06 nodeName:}" failed. 
No retries permitted until 2026-04-20 07:50:21.111245159 +0000 UTC m=+18.284572571 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-8nc7c" (UniqueName: "kubernetes.io/projected/ef4d76cc-35ef-46b3-83b9-ee5fe3bb7a06-kube-api-access-8nc7c") pod "network-check-target-sfl6t" (UID: "ef4d76cc-35ef-46b3-83b9-ee5fe3bb7a06") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 07:50:13.287319 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:13.286830 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-nv6pl" Apr 20 07:50:13.287319 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:13.286836 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7qpdh" Apr 20 07:50:13.287319 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:13.286950 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sfl6t" Apr 20 07:50:13.287319 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:13.286945 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-nv6pl" podUID="036e621d-2979-4552-ac5c-6fab5743df3a" Apr 20 07:50:13.287319 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:13.287010 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7qpdh" podUID="46b8deca-b33b-45e1-9131-b47fde192a78" Apr 20 07:50:13.287319 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:13.287106 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sfl6t" podUID="ef4d76cc-35ef-46b3-83b9-ee5fe3bb7a06" Apr 20 07:50:15.226910 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:15.226883 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/036e621d-2979-4552-ac5c-6fab5743df3a-original-pull-secret\") pod \"global-pull-secret-syncer-nv6pl\" (UID: \"036e621d-2979-4552-ac5c-6fab5743df3a\") " pod="kube-system/global-pull-secret-syncer-nv6pl" Apr 20 07:50:15.227207 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:15.227042 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 07:50:15.227207 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:15.227119 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/036e621d-2979-4552-ac5c-6fab5743df3a-original-pull-secret podName:036e621d-2979-4552-ac5c-6fab5743df3a nodeName:}" failed. No retries permitted until 2026-04-20 07:50:23.227098785 +0000 UTC m=+20.400426233 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/036e621d-2979-4552-ac5c-6fab5743df3a-original-pull-secret") pod "global-pull-secret-syncer-nv6pl" (UID: "036e621d-2979-4552-ac5c-6fab5743df3a") : object "kube-system"/"original-pull-secret" not registered Apr 20 07:50:15.286575 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:15.286541 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sfl6t" Apr 20 07:50:15.286733 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:15.286593 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7qpdh" Apr 20 07:50:15.286733 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:15.286676 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sfl6t" podUID="ef4d76cc-35ef-46b3-83b9-ee5fe3bb7a06" Apr 20 07:50:15.286733 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:15.286542 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-nv6pl" Apr 20 07:50:15.286958 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:15.286805 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7qpdh" podUID="46b8deca-b33b-45e1-9131-b47fde192a78" Apr 20 07:50:15.286958 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:15.286902 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-nv6pl" podUID="036e621d-2979-4552-ac5c-6fab5743df3a" Apr 20 07:50:17.286463 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:17.286422 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7qpdh" Apr 20 07:50:17.286463 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:17.286447 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-nv6pl" Apr 20 07:50:17.286997 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:17.286422 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sfl6t" Apr 20 07:50:17.286997 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:17.286565 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7qpdh" podUID="46b8deca-b33b-45e1-9131-b47fde192a78" Apr 20 07:50:17.286997 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:17.286676 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sfl6t" podUID="ef4d76cc-35ef-46b3-83b9-ee5fe3bb7a06" Apr 20 07:50:17.286997 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:17.286773 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-nv6pl" podUID="036e621d-2979-4552-ac5c-6fab5743df3a" Apr 20 07:50:19.286631 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:19.286587 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sfl6t" Apr 20 07:50:19.287055 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:19.286594 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7qpdh" Apr 20 07:50:19.287055 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:19.286702 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-sfl6t" podUID="ef4d76cc-35ef-46b3-83b9-ee5fe3bb7a06" Apr 20 07:50:19.287055 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:19.286816 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7qpdh" podUID="46b8deca-b33b-45e1-9131-b47fde192a78" Apr 20 07:50:19.287055 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:19.286594 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-nv6pl" Apr 20 07:50:19.287055 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:19.286950 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-nv6pl" podUID="036e621d-2979-4552-ac5c-6fab5743df3a" Apr 20 07:50:20.971722 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:20.971676 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/46b8deca-b33b-45e1-9131-b47fde192a78-metrics-certs\") pod \"network-metrics-daemon-7qpdh\" (UID: \"46b8deca-b33b-45e1-9131-b47fde192a78\") " pod="openshift-multus/network-metrics-daemon-7qpdh" Apr 20 07:50:20.972191 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:20.971804 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 07:50:20.972191 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:20.971865 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/46b8deca-b33b-45e1-9131-b47fde192a78-metrics-certs podName:46b8deca-b33b-45e1-9131-b47fde192a78 nodeName:}" failed. No retries permitted until 2026-04-20 07:50:36.971846968 +0000 UTC m=+34.145174386 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/46b8deca-b33b-45e1-9131-b47fde192a78-metrics-certs") pod "network-metrics-daemon-7qpdh" (UID: "46b8deca-b33b-45e1-9131-b47fde192a78") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 07:50:21.173221 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:21.173180 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8nc7c\" (UniqueName: \"kubernetes.io/projected/ef4d76cc-35ef-46b3-83b9-ee5fe3bb7a06-kube-api-access-8nc7c\") pod \"network-check-target-sfl6t\" (UID: \"ef4d76cc-35ef-46b3-83b9-ee5fe3bb7a06\") " pod="openshift-network-diagnostics/network-check-target-sfl6t" Apr 20 07:50:21.173394 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:21.173379 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 07:50:21.173435 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:21.173403 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 07:50:21.173435 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:21.173414 2575 projected.go:194] Error preparing data for projected volume kube-api-access-8nc7c for pod openshift-network-diagnostics/network-check-target-sfl6t: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 07:50:21.173521 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:21.173472 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ef4d76cc-35ef-46b3-83b9-ee5fe3bb7a06-kube-api-access-8nc7c podName:ef4d76cc-35ef-46b3-83b9-ee5fe3bb7a06 nodeName:}" failed. 
No retries permitted until 2026-04-20 07:50:37.173454369 +0000 UTC m=+34.346781779 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-8nc7c" (UniqueName: "kubernetes.io/projected/ef4d76cc-35ef-46b3-83b9-ee5fe3bb7a06-kube-api-access-8nc7c") pod "network-check-target-sfl6t" (UID: "ef4d76cc-35ef-46b3-83b9-ee5fe3bb7a06") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 07:50:21.285879 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:21.285787 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sfl6t" Apr 20 07:50:21.285879 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:21.285861 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-nv6pl" Apr 20 07:50:21.286170 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:21.285898 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7qpdh" Apr 20 07:50:21.286170 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:21.285982 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-nv6pl" podUID="036e621d-2979-4552-ac5c-6fab5743df3a" Apr 20 07:50:21.286381 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:21.286352 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sfl6t" podUID="ef4d76cc-35ef-46b3-83b9-ee5fe3bb7a06" Apr 20 07:50:21.286491 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:21.286467 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7qpdh" podUID="46b8deca-b33b-45e1-9131-b47fde192a78" Apr 20 07:50:23.286809 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:23.286748 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sfl6t" Apr 20 07:50:23.287187 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:23.286838 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7qpdh" Apr 20 07:50:23.287187 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:23.286843 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sfl6t" podUID="ef4d76cc-35ef-46b3-83b9-ee5fe3bb7a06" Apr 20 07:50:23.287187 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:23.286860 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-nv6pl" Apr 20 07:50:23.287187 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:23.286935 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7qpdh" podUID="46b8deca-b33b-45e1-9131-b47fde192a78" Apr 20 07:50:23.287187 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:23.286985 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-nv6pl" podUID="036e621d-2979-4552-ac5c-6fab5743df3a" Apr 20 07:50:23.289300 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:23.289280 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/036e621d-2979-4552-ac5c-6fab5743df3a-original-pull-secret\") pod \"global-pull-secret-syncer-nv6pl\" (UID: \"036e621d-2979-4552-ac5c-6fab5743df3a\") " pod="kube-system/global-pull-secret-syncer-nv6pl" Apr 20 07:50:23.289482 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:23.289468 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 07:50:23.289543 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:23.289533 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/036e621d-2979-4552-ac5c-6fab5743df3a-original-pull-secret podName:036e621d-2979-4552-ac5c-6fab5743df3a nodeName:}" failed. 
No retries permitted until 2026-04-20 07:50:39.289517863 +0000 UTC m=+36.462845283 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/036e621d-2979-4552-ac5c-6fab5743df3a-original-pull-secret") pod "global-pull-secret-syncer-nv6pl" (UID: "036e621d-2979-4552-ac5c-6fab5743df3a") : object "kube-system"/"original-pull-secret" not registered Apr 20 07:50:24.410790 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:24.410318 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-7r2r4" event={"ID":"04eada51-b824-41d7-8f99-553412d17053","Type":"ContainerStarted","Data":"8e06df043dbd7c3348c938cbfb663818054df5bde00d1df6acd91222c9d4a2e1"} Apr 20 07:50:24.411784 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:24.411756 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-gt9w2" event={"ID":"c1bad8b1-05dc-4cf2-a2b4-6354840ca7a0","Type":"ContainerStarted","Data":"03bf08e3882bdbacbf00250a29679e92c82c500b6ef5d8070f1d0512de0f8e17"} Apr 20 07:50:24.413125 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:24.413096 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-n9944" event={"ID":"b7ed9bc9-8f0a-40e4-bd11-25b4f1c3cd39","Type":"ContainerStarted","Data":"680123a68a98397aeee93819e705c621b810e8edee6b5e6278122514b674029d"} Apr 20 07:50:24.414506 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:24.414479 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-ggtd4" event={"ID":"1755049c-9fb4-42bf-8134-d41e0e7a4e97","Type":"ContainerStarted","Data":"f62eefd34cb2df3e51df6848f971f8001f23e787a30eb4431090483c22193505"} Apr 20 07:50:24.416016 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:24.415993 2575 generic.go:358] "Generic (PLEG): container finished" podID="ba7d0588-c3eb-4849-ae3b-630be7fcc621" 
containerID="9990f869703afd38fadbc1df2880a184e784fc2eee8a7304c3598d33a71c79f8" exitCode=0 Apr 20 07:50:24.416120 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:24.416077 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hzzk5" event={"ID":"ba7d0588-c3eb-4849-ae3b-630be7fcc621","Type":"ContainerDied","Data":"9990f869703afd38fadbc1df2880a184e784fc2eee8a7304c3598d33a71c79f8"} Apr 20 07:50:24.418949 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:24.418931 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-crn8d_f3a24a42-45b7-4726-bc7c-32d9f9d61eaf/ovn-acl-logging/0.log" Apr 20 07:50:24.419296 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:24.419275 2575 generic.go:358] "Generic (PLEG): container finished" podID="f3a24a42-45b7-4726-bc7c-32d9f9d61eaf" containerID="a21bfa697ae927213bdc4573ac0db294f0e0d6e0dd7f616a2c88b5c71765883f" exitCode=1 Apr 20 07:50:24.419389 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:24.419298 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-crn8d" event={"ID":"f3a24a42-45b7-4726-bc7c-32d9f9d61eaf","Type":"ContainerStarted","Data":"afc951fe2bf90742ff1388324a7db5e7e991cbd67f98bc013a502a2c06978747"} Apr 20 07:50:24.419389 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:24.419324 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-crn8d" event={"ID":"f3a24a42-45b7-4726-bc7c-32d9f9d61eaf","Type":"ContainerStarted","Data":"159634fb248ac943f70f761e35fe23a919562afb050d4469f33254317a5e99b9"} Apr 20 07:50:24.419389 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:24.419338 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-crn8d" event={"ID":"f3a24a42-45b7-4726-bc7c-32d9f9d61eaf","Type":"ContainerStarted","Data":"9bed9fd5a6f482578083916f84b66692ca8ef7507dd3498dbb8d966e40b4bd1b"} Apr 20 07:50:24.419389 
ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:24.419351 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-crn8d" event={"ID":"f3a24a42-45b7-4726-bc7c-32d9f9d61eaf","Type":"ContainerStarted","Data":"b20a5797bfcb313f470af22ba3065c6522c17fafc12eb3dc933a6b40b2dc2ddc"} Apr 20 07:50:24.419389 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:24.419368 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-crn8d" event={"ID":"f3a24a42-45b7-4726-bc7c-32d9f9d61eaf","Type":"ContainerDied","Data":"a21bfa697ae927213bdc4573ac0db294f0e0d6e0dd7f616a2c88b5c71765883f"} Apr 20 07:50:24.419389 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:24.419384 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-crn8d" event={"ID":"f3a24a42-45b7-4726-bc7c-32d9f9d61eaf","Type":"ContainerStarted","Data":"7c40dec907c9a68afdfc056e5b4620d1798496ef84efc0a75b90910da1441f4b"} Apr 20 07:50:24.420780 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:24.420756 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-24jx4" event={"ID":"9d731404-e3dc-45b5-a6c6-4d250a44a923","Type":"ContainerStarted","Data":"586e032c8ce09a4120e205891f479b81355790865ce419365bae30d3e8664c99"} Apr 20 07:50:24.422133 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:24.422112 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-z54dw" event={"ID":"14afe6ef-f671-4736-ba3f-ac6236c30291","Type":"ContainerStarted","Data":"313d6d5df0551a95e6cdd5407d15be0ee84f47e850ba7267a9a6376d2d2bd187"} Apr 20 07:50:24.425133 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:24.425092 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-24.ec2.internal" podStartSLOduration=20.425058843 podStartE2EDuration="20.425058843s" 
podCreationTimestamp="2026-04-20 07:50:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 07:50:08.395624578 +0000 UTC m=+5.568952002" watchObservedRunningTime="2026-04-20 07:50:24.425058843 +0000 UTC m=+21.598386273" Apr 20 07:50:24.425741 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:24.425706 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-7r2r4" podStartSLOduration=4.33432696 podStartE2EDuration="21.425697197s" podCreationTimestamp="2026-04-20 07:50:03 +0000 UTC" firstStartedPulling="2026-04-20 07:50:06.249795555 +0000 UTC m=+3.423122966" lastFinishedPulling="2026-04-20 07:50:23.341165782 +0000 UTC m=+20.514493203" observedRunningTime="2026-04-20 07:50:24.425000213 +0000 UTC m=+21.598327641" watchObservedRunningTime="2026-04-20 07:50:24.425697197 +0000 UTC m=+21.599024627" Apr 20 07:50:24.436183 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:24.436128 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-n9944" podStartSLOduration=9.062141181 podStartE2EDuration="21.436112551s" podCreationTimestamp="2026-04-20 07:50:03 +0000 UTC" firstStartedPulling="2026-04-20 07:50:06.224145888 +0000 UTC m=+3.397473300" lastFinishedPulling="2026-04-20 07:50:18.598117246 +0000 UTC m=+15.771444670" observedRunningTime="2026-04-20 07:50:24.435458405 +0000 UTC m=+21.608785834" watchObservedRunningTime="2026-04-20 07:50:24.436112551 +0000 UTC m=+21.609439984" Apr 20 07:50:24.466239 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:24.466185 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-ggtd4" podStartSLOduration=9.089228489 podStartE2EDuration="21.466172058s" podCreationTimestamp="2026-04-20 07:50:03 +0000 UTC" firstStartedPulling="2026-04-20 07:50:06.221471984 +0000 UTC m=+3.394799397" 
lastFinishedPulling="2026-04-20 07:50:18.598415545 +0000 UTC m=+15.771742966" observedRunningTime="2026-04-20 07:50:24.465255107 +0000 UTC m=+21.638582536" watchObservedRunningTime="2026-04-20 07:50:24.466172058 +0000 UTC m=+21.639499723" Apr 20 07:50:24.479583 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:24.478451 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-z54dw" podStartSLOduration=4.779983715 podStartE2EDuration="21.478432259s" podCreationTimestamp="2026-04-20 07:50:03 +0000 UTC" firstStartedPulling="2026-04-20 07:50:06.249834198 +0000 UTC m=+3.423161619" lastFinishedPulling="2026-04-20 07:50:22.948282742 +0000 UTC m=+20.121610163" observedRunningTime="2026-04-20 07:50:24.477390146 +0000 UTC m=+21.650717576" watchObservedRunningTime="2026-04-20 07:50:24.478432259 +0000 UTC m=+21.651759688" Apr 20 07:50:25.027021 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:25.026999 2575 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 20 07:50:25.285935 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:25.285857 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sfl6t" Apr 20 07:50:25.285935 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:25.285923 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7qpdh" Apr 20 07:50:25.286117 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:25.286015 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-sfl6t" podUID="ef4d76cc-35ef-46b3-83b9-ee5fe3bb7a06" Apr 20 07:50:25.286117 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:25.286022 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-nv6pl" Apr 20 07:50:25.286199 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:25.286112 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7qpdh" podUID="46b8deca-b33b-45e1-9131-b47fde192a78" Apr 20 07:50:25.286236 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:25.286213 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-nv6pl" podUID="036e621d-2979-4552-ac5c-6fab5743df3a" Apr 20 07:50:25.294413 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:25.294316 2575 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-20T07:50:25.027017394Z","UUID":"786da1f1-874c-4a69-88b2-cc139821860e","Handler":null,"Name":"","Endpoint":""} Apr 20 07:50:25.296920 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:25.296897 2575 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 20 07:50:25.297034 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:25.296933 2575 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 20 07:50:25.426051 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:25.426006 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-24jx4" event={"ID":"9d731404-e3dc-45b5-a6c6-4d250a44a923","Type":"ContainerStarted","Data":"2a4bffad4ee8f5307f55231ab942a852d39bcd18bcf4e5b38ac7d63848d1677c"} Apr 20 07:50:25.427544 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:25.427511 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-459v4" event={"ID":"f8898642-a099-48c6-ba9f-0a5099d78d5c","Type":"ContainerStarted","Data":"03a1a59ce60915a952118915f16795a746efe71772ec80fcaef424d4b7d37220"} Apr 20 07:50:25.450888 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:25.450845 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-459v4" podStartSLOduration=5.431432207 podStartE2EDuration="22.450832457s" podCreationTimestamp="2026-04-20 07:50:03 +0000 UTC" 
firstStartedPulling="2026-04-20 07:50:06.249648084 +0000 UTC m=+3.422975492" lastFinishedPulling="2026-04-20 07:50:23.269048335 +0000 UTC m=+20.442375742" observedRunningTime="2026-04-20 07:50:25.450477997 +0000 UTC m=+22.623805426" watchObservedRunningTime="2026-04-20 07:50:25.450832457 +0000 UTC m=+22.624159921" Apr 20 07:50:25.451012 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:25.450917 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-gt9w2" podStartSLOduration=5.378368759 podStartE2EDuration="22.450913354s" podCreationTimestamp="2026-04-20 07:50:03 +0000 UTC" firstStartedPulling="2026-04-20 07:50:06.226016771 +0000 UTC m=+3.399344190" lastFinishedPulling="2026-04-20 07:50:23.298561368 +0000 UTC m=+20.471888785" observedRunningTime="2026-04-20 07:50:24.492482264 +0000 UTC m=+21.665809693" watchObservedRunningTime="2026-04-20 07:50:25.450913354 +0000 UTC m=+22.624240788" Apr 20 07:50:26.120550 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:26.120526 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-ggtd4" Apr 20 07:50:26.121054 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:26.121036 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-ggtd4" Apr 20 07:50:26.433310 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:26.433052 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-crn8d_f3a24a42-45b7-4726-bc7c-32d9f9d61eaf/ovn-acl-logging/0.log" Apr 20 07:50:26.433814 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:26.433668 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-crn8d" event={"ID":"f3a24a42-45b7-4726-bc7c-32d9f9d61eaf","Type":"ContainerStarted","Data":"552e573a20ee7d74149e0a530bb20c048bd4fbf0e2589d24b6249681702afaa7"} Apr 20 07:50:26.436041 ip-10-0-129-24 
kubenswrapper[2575]: I0420 07:50:26.435556 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-24jx4" event={"ID":"9d731404-e3dc-45b5-a6c6-4d250a44a923","Type":"ContainerStarted","Data":"f7aa1f4023532510f94ec82a9f05a501ef20de0813ece00b2aaab71eec3f7cd2"} Apr 20 07:50:26.436041 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:26.435893 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-ggtd4" Apr 20 07:50:26.436335 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:26.436318 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-ggtd4" Apr 20 07:50:26.455582 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:26.455535 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-24jx4" podStartSLOduration=3.420575584 podStartE2EDuration="23.455522215s" podCreationTimestamp="2026-04-20 07:50:03 +0000 UTC" firstStartedPulling="2026-04-20 07:50:06.217254632 +0000 UTC m=+3.390582050" lastFinishedPulling="2026-04-20 07:50:26.252201273 +0000 UTC m=+23.425528681" observedRunningTime="2026-04-20 07:50:26.45498284 +0000 UTC m=+23.628310292" watchObservedRunningTime="2026-04-20 07:50:26.455522215 +0000 UTC m=+23.628849643" Apr 20 07:50:27.286687 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:27.286653 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sfl6t" Apr 20 07:50:27.286916 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:27.286657 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7qpdh" Apr 20 07:50:27.286916 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:27.286657 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-nv6pl" Apr 20 07:50:27.286916 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:27.286862 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7qpdh" podUID="46b8deca-b33b-45e1-9131-b47fde192a78" Apr 20 07:50:27.287048 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:27.286750 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sfl6t" podUID="ef4d76cc-35ef-46b3-83b9-ee5fe3bb7a06" Apr 20 07:50:27.287048 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:27.286922 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-nv6pl" podUID="036e621d-2979-4552-ac5c-6fab5743df3a" Apr 20 07:50:29.285951 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:29.285706 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7qpdh" Apr 20 07:50:29.286535 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:29.285706 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-nv6pl" Apr 20 07:50:29.286535 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:29.285738 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sfl6t" Apr 20 07:50:29.286535 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:29.286081 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7qpdh" podUID="46b8deca-b33b-45e1-9131-b47fde192a78" Apr 20 07:50:29.286535 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:29.286176 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sfl6t" podUID="ef4d76cc-35ef-46b3-83b9-ee5fe3bb7a06" Apr 20 07:50:29.286535 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:29.286262 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-nv6pl" podUID="036e621d-2979-4552-ac5c-6fab5743df3a" Apr 20 07:50:29.442427 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:29.442395 2575 generic.go:358] "Generic (PLEG): container finished" podID="ba7d0588-c3eb-4849-ae3b-630be7fcc621" containerID="eba83f0cf8998b3ad9aa3f7e75f8ba012245255628446a40c855483d90c5ec96" exitCode=0 Apr 20 07:50:29.442576 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:29.442474 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hzzk5" event={"ID":"ba7d0588-c3eb-4849-ae3b-630be7fcc621","Type":"ContainerDied","Data":"eba83f0cf8998b3ad9aa3f7e75f8ba012245255628446a40c855483d90c5ec96"} Apr 20 07:50:29.445348 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:29.445300 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-crn8d_f3a24a42-45b7-4726-bc7c-32d9f9d61eaf/ovn-acl-logging/0.log" Apr 20 07:50:29.445681 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:29.445662 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-crn8d" event={"ID":"f3a24a42-45b7-4726-bc7c-32d9f9d61eaf","Type":"ContainerStarted","Data":"e65cb4022e1ff0179396bd0361bb14cbe740c00fbd02823dfbd4f640eeb581ef"} Apr 20 07:50:29.445959 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:29.445937 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-crn8d" Apr 20 07:50:29.446065 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:29.445966 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-crn8d" Apr 20 07:50:29.446112 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:29.446077 2575 scope.go:117] "RemoveContainer" containerID="a21bfa697ae927213bdc4573ac0db294f0e0d6e0dd7f616a2c88b5c71765883f" Apr 20 07:50:29.460478 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:29.460460 
2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-crn8d"
Apr 20 07:50:30.449701 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:30.449669 2575 generic.go:358] "Generic (PLEG): container finished" podID="ba7d0588-c3eb-4849-ae3b-630be7fcc621" containerID="c64117f7923a4f505f50b6321217e94489c997c958aaba865abc14ff3273e951" exitCode=0
Apr 20 07:50:30.450105 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:30.449752 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hzzk5" event={"ID":"ba7d0588-c3eb-4849-ae3b-630be7fcc621","Type":"ContainerDied","Data":"c64117f7923a4f505f50b6321217e94489c997c958aaba865abc14ff3273e951"}
Apr 20 07:50:30.452967 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:30.452948 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-crn8d_f3a24a42-45b7-4726-bc7c-32d9f9d61eaf/ovn-acl-logging/0.log"
Apr 20 07:50:30.453273 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:30.453251 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-crn8d" event={"ID":"f3a24a42-45b7-4726-bc7c-32d9f9d61eaf","Type":"ContainerStarted","Data":"d700eb9d8d2909dd9cde79fd8745b0f1b189cb5e417d2551637026b43016b645"}
Apr 20 07:50:30.453501 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:30.453490 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-crn8d"
Apr 20 07:50:30.467422 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:30.467402 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-crn8d"
Apr 20 07:50:30.494392 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:30.494351 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-crn8d" podStartSLOduration=10.310033908 podStartE2EDuration="27.49433892s" podCreationTimestamp="2026-04-20 07:50:03 +0000 UTC" firstStartedPulling="2026-04-20 07:50:06.219198009 +0000 UTC m=+3.392525417" lastFinishedPulling="2026-04-20 07:50:23.403503021 +0000 UTC m=+20.576830429" observedRunningTime="2026-04-20 07:50:30.492749937 +0000 UTC m=+27.666077368" watchObservedRunningTime="2026-04-20 07:50:30.49433892 +0000 UTC m=+27.667666349"
Apr 20 07:50:30.958782 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:30.958749 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-7qpdh"]
Apr 20 07:50:30.958954 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:30.958905 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7qpdh"
Apr 20 07:50:30.959053 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:30.959023 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7qpdh" podUID="46b8deca-b33b-45e1-9131-b47fde192a78"
Apr 20 07:50:30.961397 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:30.961102 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-nv6pl"]
Apr 20 07:50:30.961397 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:30.961227 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-nv6pl"
Apr 20 07:50:30.961397 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:30.961358 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-nv6pl" podUID="036e621d-2979-4552-ac5c-6fab5743df3a"
Apr 20 07:50:30.961983 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:30.961957 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-sfl6t"]
Apr 20 07:50:30.962347 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:30.962065 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sfl6t"
Apr 20 07:50:30.962347 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:30.962152 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sfl6t" podUID="ef4d76cc-35ef-46b3-83b9-ee5fe3bb7a06"
Apr 20 07:50:31.456934 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:31.456900 2575 generic.go:358] "Generic (PLEG): container finished" podID="ba7d0588-c3eb-4849-ae3b-630be7fcc621" containerID="2d0aceb3db3e2f5ee6b0e03c327b1dab02e20bd156be505b194614aa411b923b" exitCode=0
Apr 20 07:50:31.457355 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:31.456982 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hzzk5" event={"ID":"ba7d0588-c3eb-4849-ae3b-630be7fcc621","Type":"ContainerDied","Data":"2d0aceb3db3e2f5ee6b0e03c327b1dab02e20bd156be505b194614aa411b923b"}
Apr 20 07:50:32.286731 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:32.286688 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7qpdh"
Apr 20 07:50:32.286908 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:32.286740 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sfl6t"
Apr 20 07:50:32.286908 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:32.286782 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-nv6pl"
Apr 20 07:50:32.286908 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:32.286870 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-nv6pl" podUID="036e621d-2979-4552-ac5c-6fab5743df3a"
Apr 20 07:50:32.287039 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:32.286990 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sfl6t" podUID="ef4d76cc-35ef-46b3-83b9-ee5fe3bb7a06"
Apr 20 07:50:32.287083 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:32.287072 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7qpdh" podUID="46b8deca-b33b-45e1-9131-b47fde192a78"
Apr 20 07:50:34.286212 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:34.285965 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-nv6pl"
Apr 20 07:50:34.286697 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:34.285961 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7qpdh"
Apr 20 07:50:34.286697 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:34.286314 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-nv6pl" podUID="036e621d-2979-4552-ac5c-6fab5743df3a"
Apr 20 07:50:34.286697 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:34.286380 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7qpdh" podUID="46b8deca-b33b-45e1-9131-b47fde192a78"
Apr 20 07:50:34.286697 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:34.285980 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sfl6t"
Apr 20 07:50:34.286697 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:34.286458 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sfl6t" podUID="ef4d76cc-35ef-46b3-83b9-ee5fe3bb7a06"
Apr 20 07:50:36.286197 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:36.286163 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7qpdh"
Apr 20 07:50:36.286197 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:36.286191 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-nv6pl"
Apr 20 07:50:36.286864 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:36.286163 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sfl6t"
Apr 20 07:50:36.286864 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:36.286294 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7qpdh" podUID="46b8deca-b33b-45e1-9131-b47fde192a78"
Apr 20 07:50:36.286864 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:36.286375 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sfl6t" podUID="ef4d76cc-35ef-46b3-83b9-ee5fe3bb7a06"
Apr 20 07:50:36.286864 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:36.286442 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-nv6pl" podUID="036e621d-2979-4552-ac5c-6fab5743df3a"
Apr 20 07:50:36.634700 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:36.634671 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-24.ec2.internal" event="NodeReady"
Apr 20 07:50:36.634856 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:36.634835 2575 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 20 07:50:36.665746 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:36.665719 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-7bd5554bd9-jrm2x"]
Apr 20 07:50:36.669695 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:36.669669 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-smjd7"]
Apr 20 07:50:36.669851 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:36.669830 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7bd5554bd9-jrm2x"
Apr 20 07:50:36.672153 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:36.672129 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 20 07:50:36.672449 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:36.672378 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 20 07:50:36.672449 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:36.672380 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-68mlz\""
Apr 20 07:50:36.672449 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:36.672428 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 20 07:50:36.672742 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:36.672718 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-smjd7"
Apr 20 07:50:36.678128 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:36.678103 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-qlz5k\""
Apr 20 07:50:36.678256 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:36.678132 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\""
Apr 20 07:50:36.678256 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:36.678155 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\""
Apr 20 07:50:36.680367 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:36.680113 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 20 07:50:36.681412 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:36.681201 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-smjd7"]
Apr 20 07:50:36.682259 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:36.682225 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-7bd5554bd9-jrm2x"]
Apr 20 07:50:36.683737 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:36.683718 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-mqbmm"]
Apr 20 07:50:36.686704 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:36.686684 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-mqbmm"
Apr 20 07:50:36.689276 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:36.689253 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 20 07:50:36.689413 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:36.689399 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-mxq77\""
Apr 20 07:50:36.689480 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:36.689457 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 20 07:50:36.689686 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:36.689582 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 20 07:50:36.695498 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:36.695058 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-mqbmm"]
Apr 20 07:50:36.785007 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:36.784971 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-jp28h"]
Apr 20 07:50:36.787988 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:36.787958 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/397df66e-7585-4bc3-aaa7-dcdc37635fc5-installation-pull-secrets\") pod \"image-registry-7bd5554bd9-jrm2x\" (UID: \"397df66e-7585-4bc3-aaa7-dcdc37635fc5\") " pod="openshift-image-registry/image-registry-7bd5554bd9-jrm2x"
Apr 20 07:50:36.788134 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:36.788035 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/397df66e-7585-4bc3-aaa7-dcdc37635fc5-registry-tls\") pod \"image-registry-7bd5554bd9-jrm2x\" (UID: \"397df66e-7585-4bc3-aaa7-dcdc37635fc5\") " pod="openshift-image-registry/image-registry-7bd5554bd9-jrm2x"
Apr 20 07:50:36.788134 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:36.788076 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8z7nn\" (UniqueName: \"kubernetes.io/projected/5eaa06ed-8495-4d3f-a0e2-d67bb19b8695-kube-api-access-8z7nn\") pod \"ingress-canary-mqbmm\" (UID: \"5eaa06ed-8495-4d3f-a0e2-d67bb19b8695\") " pod="openshift-ingress-canary/ingress-canary-mqbmm"
Apr 20 07:50:36.788276 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:36.788141 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/397df66e-7585-4bc3-aaa7-dcdc37635fc5-registry-certificates\") pod \"image-registry-7bd5554bd9-jrm2x\" (UID: \"397df66e-7585-4bc3-aaa7-dcdc37635fc5\") " pod="openshift-image-registry/image-registry-7bd5554bd9-jrm2x"
Apr 20 07:50:36.788276 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:36.788176 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/397df66e-7585-4bc3-aaa7-dcdc37635fc5-bound-sa-token\") pod \"image-registry-7bd5554bd9-jrm2x\" (UID: \"397df66e-7585-4bc3-aaa7-dcdc37635fc5\") " pod="openshift-image-registry/image-registry-7bd5554bd9-jrm2x"
Apr 20 07:50:36.788276 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:36.788188 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-jp28h"
Apr 20 07:50:36.788276 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:36.788222 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/2e4b6f0f-88ff-49a7-ada1-7af9515863da-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-smjd7\" (UID: \"2e4b6f0f-88ff-49a7-ada1-7af9515863da\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-smjd7"
Apr 20 07:50:36.788276 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:36.788254 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/2e4b6f0f-88ff-49a7-ada1-7af9515863da-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-smjd7\" (UID: \"2e4b6f0f-88ff-49a7-ada1-7af9515863da\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-smjd7"
Apr 20 07:50:36.788502 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:36.788289 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vr5h5\" (UniqueName: \"kubernetes.io/projected/397df66e-7585-4bc3-aaa7-dcdc37635fc5-kube-api-access-vr5h5\") pod \"image-registry-7bd5554bd9-jrm2x\" (UID: \"397df66e-7585-4bc3-aaa7-dcdc37635fc5\") " pod="openshift-image-registry/image-registry-7bd5554bd9-jrm2x"
Apr 20 07:50:36.788502 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:36.788323 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/397df66e-7585-4bc3-aaa7-dcdc37635fc5-trusted-ca\") pod \"image-registry-7bd5554bd9-jrm2x\" (UID: \"397df66e-7585-4bc3-aaa7-dcdc37635fc5\") " pod="openshift-image-registry/image-registry-7bd5554bd9-jrm2x"
Apr 20 07:50:36.788502 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:36.788345 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5eaa06ed-8495-4d3f-a0e2-d67bb19b8695-cert\") pod \"ingress-canary-mqbmm\" (UID: \"5eaa06ed-8495-4d3f-a0e2-d67bb19b8695\") " pod="openshift-ingress-canary/ingress-canary-mqbmm"
Apr 20 07:50:36.788502 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:36.788377 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/397df66e-7585-4bc3-aaa7-dcdc37635fc5-ca-trust-extracted\") pod \"image-registry-7bd5554bd9-jrm2x\" (UID: \"397df66e-7585-4bc3-aaa7-dcdc37635fc5\") " pod="openshift-image-registry/image-registry-7bd5554bd9-jrm2x"
Apr 20 07:50:36.788502 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:36.788403 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/397df66e-7585-4bc3-aaa7-dcdc37635fc5-image-registry-private-configuration\") pod \"image-registry-7bd5554bd9-jrm2x\" (UID: \"397df66e-7585-4bc3-aaa7-dcdc37635fc5\") " pod="openshift-image-registry/image-registry-7bd5554bd9-jrm2x"
Apr 20 07:50:36.791075 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:36.790839 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 20 07:50:36.791075 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:36.790861 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 20 07:50:36.791075 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:36.790893 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-dvf6t\""
Apr 20 07:50:36.798316 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:36.798294 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-jp28h"]
Apr 20 07:50:36.889344 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:36.889263 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/397df66e-7585-4bc3-aaa7-dcdc37635fc5-trusted-ca\") pod \"image-registry-7bd5554bd9-jrm2x\" (UID: \"397df66e-7585-4bc3-aaa7-dcdc37635fc5\") " pod="openshift-image-registry/image-registry-7bd5554bd9-jrm2x"
Apr 20 07:50:36.889344 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:36.889303 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5eaa06ed-8495-4d3f-a0e2-d67bb19b8695-cert\") pod \"ingress-canary-mqbmm\" (UID: \"5eaa06ed-8495-4d3f-a0e2-d67bb19b8695\") " pod="openshift-ingress-canary/ingress-canary-mqbmm"
Apr 20 07:50:36.889573 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:36.889417 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 07:50:36.889573 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:36.889442 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/397df66e-7585-4bc3-aaa7-dcdc37635fc5-ca-trust-extracted\") pod \"image-registry-7bd5554bd9-jrm2x\" (UID: \"397df66e-7585-4bc3-aaa7-dcdc37635fc5\") " pod="openshift-image-registry/image-registry-7bd5554bd9-jrm2x"
Apr 20 07:50:36.889573 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:36.889481 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5eaa06ed-8495-4d3f-a0e2-d67bb19b8695-cert podName:5eaa06ed-8495-4d3f-a0e2-d67bb19b8695 nodeName:}" failed. No retries permitted until 2026-04-20 07:50:37.389459441 +0000 UTC m=+34.562786853 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5eaa06ed-8495-4d3f-a0e2-d67bb19b8695-cert") pod "ingress-canary-mqbmm" (UID: "5eaa06ed-8495-4d3f-a0e2-d67bb19b8695") : secret "canary-serving-cert" not found
Apr 20 07:50:36.889573 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:36.889513 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/397df66e-7585-4bc3-aaa7-dcdc37635fc5-image-registry-private-configuration\") pod \"image-registry-7bd5554bd9-jrm2x\" (UID: \"397df66e-7585-4bc3-aaa7-dcdc37635fc5\") " pod="openshift-image-registry/image-registry-7bd5554bd9-jrm2x"
Apr 20 07:50:36.889573 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:36.889547 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/eccac7f6-7962-45d6-9141-b02deb90631f-tmp-dir\") pod \"dns-default-jp28h\" (UID: \"eccac7f6-7962-45d6-9141-b02deb90631f\") " pod="openshift-dns/dns-default-jp28h"
Apr 20 07:50:36.889881 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:36.889605 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/397df66e-7585-4bc3-aaa7-dcdc37635fc5-installation-pull-secrets\") pod \"image-registry-7bd5554bd9-jrm2x\" (UID: \"397df66e-7585-4bc3-aaa7-dcdc37635fc5\") " pod="openshift-image-registry/image-registry-7bd5554bd9-jrm2x"
Apr 20 07:50:36.889881 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:36.889683 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/397df66e-7585-4bc3-aaa7-dcdc37635fc5-registry-tls\") pod \"image-registry-7bd5554bd9-jrm2x\" (UID: \"397df66e-7585-4bc3-aaa7-dcdc37635fc5\") " pod="openshift-image-registry/image-registry-7bd5554bd9-jrm2x"
Apr 20 07:50:36.889881 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:36.889712 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8z7nn\" (UniqueName: \"kubernetes.io/projected/5eaa06ed-8495-4d3f-a0e2-d67bb19b8695-kube-api-access-8z7nn\") pod \"ingress-canary-mqbmm\" (UID: \"5eaa06ed-8495-4d3f-a0e2-d67bb19b8695\") " pod="openshift-ingress-canary/ingress-canary-mqbmm"
Apr 20 07:50:36.889881 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:36.889741 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gft66\" (UniqueName: \"kubernetes.io/projected/eccac7f6-7962-45d6-9141-b02deb90631f-kube-api-access-gft66\") pod \"dns-default-jp28h\" (UID: \"eccac7f6-7962-45d6-9141-b02deb90631f\") " pod="openshift-dns/dns-default-jp28h"
Apr 20 07:50:36.889881 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:36.889772 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/397df66e-7585-4bc3-aaa7-dcdc37635fc5-registry-certificates\") pod \"image-registry-7bd5554bd9-jrm2x\" (UID: \"397df66e-7585-4bc3-aaa7-dcdc37635fc5\") " pod="openshift-image-registry/image-registry-7bd5554bd9-jrm2x"
Apr 20 07:50:36.889881 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:36.889797 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/397df66e-7585-4bc3-aaa7-dcdc37635fc5-bound-sa-token\") pod \"image-registry-7bd5554bd9-jrm2x\" (UID: \"397df66e-7585-4bc3-aaa7-dcdc37635fc5\") " pod="openshift-image-registry/image-registry-7bd5554bd9-jrm2x"
Apr 20 07:50:36.889881 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:36.889824 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/eccac7f6-7962-45d6-9141-b02deb90631f-metrics-tls\") pod \"dns-default-jp28h\" (UID: \"eccac7f6-7962-45d6-9141-b02deb90631f\") " pod="openshift-dns/dns-default-jp28h"
Apr 20 07:50:36.889881 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:36.889854 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/2e4b6f0f-88ff-49a7-ada1-7af9515863da-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-smjd7\" (UID: \"2e4b6f0f-88ff-49a7-ada1-7af9515863da\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-smjd7"
Apr 20 07:50:36.890249 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:36.889889 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/2e4b6f0f-88ff-49a7-ada1-7af9515863da-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-smjd7\" (UID: \"2e4b6f0f-88ff-49a7-ada1-7af9515863da\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-smjd7"
Apr 20 07:50:36.890249 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:36.889921 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vr5h5\" (UniqueName: \"kubernetes.io/projected/397df66e-7585-4bc3-aaa7-dcdc37635fc5-kube-api-access-vr5h5\") pod \"image-registry-7bd5554bd9-jrm2x\" (UID: \"397df66e-7585-4bc3-aaa7-dcdc37635fc5\") " pod="openshift-image-registry/image-registry-7bd5554bd9-jrm2x"
Apr 20 07:50:36.890249 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:36.889947 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/eccac7f6-7962-45d6-9141-b02deb90631f-config-volume\") pod \"dns-default-jp28h\" (UID: \"eccac7f6-7962-45d6-9141-b02deb90631f\") " pod="openshift-dns/dns-default-jp28h"
Apr 20 07:50:36.890249 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:36.890016 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/397df66e-7585-4bc3-aaa7-dcdc37635fc5-ca-trust-extracted\") pod \"image-registry-7bd5554bd9-jrm2x\" (UID: \"397df66e-7585-4bc3-aaa7-dcdc37635fc5\") " pod="openshift-image-registry/image-registry-7bd5554bd9-jrm2x"
Apr 20 07:50:36.890249 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:36.890082 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 20 07:50:36.890249 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:36.890095 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7bd5554bd9-jrm2x: secret "image-registry-tls" not found
Apr 20 07:50:36.890249 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:36.890135 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/397df66e-7585-4bc3-aaa7-dcdc37635fc5-registry-tls podName:397df66e-7585-4bc3-aaa7-dcdc37635fc5 nodeName:}" failed. No retries permitted until 2026-04-20 07:50:37.390120646 +0000 UTC m=+34.563448072 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/397df66e-7585-4bc3-aaa7-dcdc37635fc5-registry-tls") pod "image-registry-7bd5554bd9-jrm2x" (UID: "397df66e-7585-4bc3-aaa7-dcdc37635fc5") : secret "image-registry-tls" not found
Apr 20 07:50:36.890249 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:36.890236 2575 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 20 07:50:36.890709 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:36.890273 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2e4b6f0f-88ff-49a7-ada1-7af9515863da-networking-console-plugin-cert podName:2e4b6f0f-88ff-49a7-ada1-7af9515863da nodeName:}" failed. No retries permitted until 2026-04-20 07:50:37.390262098 +0000 UTC m=+34.563589509 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/2e4b6f0f-88ff-49a7-ada1-7af9515863da-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-smjd7" (UID: "2e4b6f0f-88ff-49a7-ada1-7af9515863da") : secret "networking-console-plugin-cert" not found
Apr 20 07:50:36.890709 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:36.890486 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/397df66e-7585-4bc3-aaa7-dcdc37635fc5-trusted-ca\") pod \"image-registry-7bd5554bd9-jrm2x\" (UID: \"397df66e-7585-4bc3-aaa7-dcdc37635fc5\") " pod="openshift-image-registry/image-registry-7bd5554bd9-jrm2x"
Apr 20 07:50:36.890876 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:36.890855 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/2e4b6f0f-88ff-49a7-ada1-7af9515863da-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-smjd7\" (UID: \"2e4b6f0f-88ff-49a7-ada1-7af9515863da\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-smjd7"
Apr 20 07:50:36.890931 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:36.890861 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/397df66e-7585-4bc3-aaa7-dcdc37635fc5-registry-certificates\") pod \"image-registry-7bd5554bd9-jrm2x\" (UID: \"397df66e-7585-4bc3-aaa7-dcdc37635fc5\") " pod="openshift-image-registry/image-registry-7bd5554bd9-jrm2x"
Apr 20 07:50:36.893980 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:36.893960 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/397df66e-7585-4bc3-aaa7-dcdc37635fc5-installation-pull-secrets\") pod \"image-registry-7bd5554bd9-jrm2x\" (UID: \"397df66e-7585-4bc3-aaa7-dcdc37635fc5\") " pod="openshift-image-registry/image-registry-7bd5554bd9-jrm2x"
Apr 20 07:50:36.894079 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:36.893981 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/397df66e-7585-4bc3-aaa7-dcdc37635fc5-image-registry-private-configuration\") pod \"image-registry-7bd5554bd9-jrm2x\" (UID: \"397df66e-7585-4bc3-aaa7-dcdc37635fc5\") " pod="openshift-image-registry/image-registry-7bd5554bd9-jrm2x"
Apr 20 07:50:36.903320 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:36.903300 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/397df66e-7585-4bc3-aaa7-dcdc37635fc5-bound-sa-token\") pod \"image-registry-7bd5554bd9-jrm2x\" (UID: \"397df66e-7585-4bc3-aaa7-dcdc37635fc5\") " pod="openshift-image-registry/image-registry-7bd5554bd9-jrm2x"
Apr 20 07:50:36.905814 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:36.905792 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8z7nn\" (UniqueName: \"kubernetes.io/projected/5eaa06ed-8495-4d3f-a0e2-d67bb19b8695-kube-api-access-8z7nn\") pod \"ingress-canary-mqbmm\" (UID: \"5eaa06ed-8495-4d3f-a0e2-d67bb19b8695\") " pod="openshift-ingress-canary/ingress-canary-mqbmm"
Apr 20 07:50:36.905915 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:36.905890 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vr5h5\" (UniqueName: \"kubernetes.io/projected/397df66e-7585-4bc3-aaa7-dcdc37635fc5-kube-api-access-vr5h5\") pod \"image-registry-7bd5554bd9-jrm2x\" (UID: \"397df66e-7585-4bc3-aaa7-dcdc37635fc5\") " pod="openshift-image-registry/image-registry-7bd5554bd9-jrm2x"
Apr 20 07:50:36.991105 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:36.991068 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/eccac7f6-7962-45d6-9141-b02deb90631f-metrics-tls\") pod \"dns-default-jp28h\" (UID: \"eccac7f6-7962-45d6-9141-b02deb90631f\") " pod="openshift-dns/dns-default-jp28h"
Apr 20 07:50:36.991286 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:36.991122 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/eccac7f6-7962-45d6-9141-b02deb90631f-config-volume\") pod \"dns-default-jp28h\" (UID: \"eccac7f6-7962-45d6-9141-b02deb90631f\") " pod="openshift-dns/dns-default-jp28h"
Apr 20 07:50:36.991286 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:36.991170 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/eccac7f6-7962-45d6-9141-b02deb90631f-tmp-dir\") pod \"dns-default-jp28h\" (UID: \"eccac7f6-7962-45d6-9141-b02deb90631f\") " pod="openshift-dns/dns-default-jp28h"
Apr 20 07:50:36.991286 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:36.991210 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/46b8deca-b33b-45e1-9131-b47fde192a78-metrics-certs\") pod \"network-metrics-daemon-7qpdh\" (UID: \"46b8deca-b33b-45e1-9131-b47fde192a78\") " pod="openshift-multus/network-metrics-daemon-7qpdh"
Apr 20 07:50:36.991286 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:36.991230 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 07:50:36.991286 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:36.991259 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gft66\" (UniqueName: \"kubernetes.io/projected/eccac7f6-7962-45d6-9141-b02deb90631f-kube-api-access-gft66\") pod \"dns-default-jp28h\" (UID: \"eccac7f6-7962-45d6-9141-b02deb90631f\") " pod="openshift-dns/dns-default-jp28h"
Apr 20 07:50:36.991520 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:36.991314 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eccac7f6-7962-45d6-9141-b02deb90631f-metrics-tls podName:eccac7f6-7962-45d6-9141-b02deb90631f nodeName:}" failed. No retries permitted until 2026-04-20 07:50:37.491292347 +0000 UTC m=+34.664619767 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/eccac7f6-7962-45d6-9141-b02deb90631f-metrics-tls") pod "dns-default-jp28h" (UID: "eccac7f6-7962-45d6-9141-b02deb90631f") : secret "dns-default-metrics-tls" not found
Apr 20 07:50:36.991690 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:36.991667 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/eccac7f6-7962-45d6-9141-b02deb90631f-tmp-dir\") pod \"dns-default-jp28h\" (UID: \"eccac7f6-7962-45d6-9141-b02deb90631f\") " pod="openshift-dns/dns-default-jp28h"
Apr 20 07:50:36.991794 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:36.991680 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 07:50:36.991794 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:36.991764 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/46b8deca-b33b-45e1-9131-b47fde192a78-metrics-certs podName:46b8deca-b33b-45e1-9131-b47fde192a78 nodeName:}" failed. No retries permitted until 2026-04-20 07:51:08.991747724 +0000 UTC m=+66.165075133 (durationBeforeRetry 32s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/46b8deca-b33b-45e1-9131-b47fde192a78-metrics-certs") pod "network-metrics-daemon-7qpdh" (UID: "46b8deca-b33b-45e1-9131-b47fde192a78") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 07:50:36.991884 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:36.991868 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/eccac7f6-7962-45d6-9141-b02deb90631f-config-volume\") pod \"dns-default-jp28h\" (UID: \"eccac7f6-7962-45d6-9141-b02deb90631f\") " pod="openshift-dns/dns-default-jp28h" Apr 20 07:50:37.001190 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:37.001163 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gft66\" (UniqueName: \"kubernetes.io/projected/eccac7f6-7962-45d6-9141-b02deb90631f-kube-api-access-gft66\") pod \"dns-default-jp28h\" (UID: \"eccac7f6-7962-45d6-9141-b02deb90631f\") " pod="openshift-dns/dns-default-jp28h" Apr 20 07:50:37.192681 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:37.192585 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8nc7c\" (UniqueName: \"kubernetes.io/projected/ef4d76cc-35ef-46b3-83b9-ee5fe3bb7a06-kube-api-access-8nc7c\") pod \"network-check-target-sfl6t\" (UID: \"ef4d76cc-35ef-46b3-83b9-ee5fe3bb7a06\") " pod="openshift-network-diagnostics/network-check-target-sfl6t" Apr 20 07:50:37.192854 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:37.192784 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 07:50:37.192854 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:37.192810 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 07:50:37.192854 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:37.192830 2575 projected.go:194] Error preparing data for projected volume kube-api-access-8nc7c for pod openshift-network-diagnostics/network-check-target-sfl6t: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 07:50:37.193000 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:37.192898 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ef4d76cc-35ef-46b3-83b9-ee5fe3bb7a06-kube-api-access-8nc7c podName:ef4d76cc-35ef-46b3-83b9-ee5fe3bb7a06 nodeName:}" failed. No retries permitted until 2026-04-20 07:51:09.192878042 +0000 UTC m=+66.366205452 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-8nc7c" (UniqueName: "kubernetes.io/projected/ef4d76cc-35ef-46b3-83b9-ee5fe3bb7a06-kube-api-access-8nc7c") pod "network-check-target-sfl6t" (UID: "ef4d76cc-35ef-46b3-83b9-ee5fe3bb7a06") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 07:50:37.394024 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:37.393985 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/397df66e-7585-4bc3-aaa7-dcdc37635fc5-registry-tls\") pod \"image-registry-7bd5554bd9-jrm2x\" (UID: \"397df66e-7585-4bc3-aaa7-dcdc37635fc5\") " pod="openshift-image-registry/image-registry-7bd5554bd9-jrm2x" Apr 20 07:50:37.394677 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:37.394046 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/2e4b6f0f-88ff-49a7-ada1-7af9515863da-networking-console-plugin-cert\") 
pod \"networking-console-plugin-cb95c66f6-smjd7\" (UID: \"2e4b6f0f-88ff-49a7-ada1-7af9515863da\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-smjd7" Apr 20 07:50:37.394677 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:37.394136 2575 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 20 07:50:37.394677 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:37.394172 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 07:50:37.394677 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:37.394194 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7bd5554bd9-jrm2x: secret "image-registry-tls" not found Apr 20 07:50:37.394677 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:37.394205 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2e4b6f0f-88ff-49a7-ada1-7af9515863da-networking-console-plugin-cert podName:2e4b6f0f-88ff-49a7-ada1-7af9515863da nodeName:}" failed. No retries permitted until 2026-04-20 07:50:38.394192306 +0000 UTC m=+35.567519713 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/2e4b6f0f-88ff-49a7-ada1-7af9515863da-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-smjd7" (UID: "2e4b6f0f-88ff-49a7-ada1-7af9515863da") : secret "networking-console-plugin-cert" not found Apr 20 07:50:37.394677 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:37.394220 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5eaa06ed-8495-4d3f-a0e2-d67bb19b8695-cert\") pod \"ingress-canary-mqbmm\" (UID: \"5eaa06ed-8495-4d3f-a0e2-d67bb19b8695\") " pod="openshift-ingress-canary/ingress-canary-mqbmm" Apr 20 07:50:37.394677 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:37.394260 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/397df66e-7585-4bc3-aaa7-dcdc37635fc5-registry-tls podName:397df66e-7585-4bc3-aaa7-dcdc37635fc5 nodeName:}" failed. No retries permitted until 2026-04-20 07:50:38.394243056 +0000 UTC m=+35.567570477 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/397df66e-7585-4bc3-aaa7-dcdc37635fc5-registry-tls") pod "image-registry-7bd5554bd9-jrm2x" (UID: "397df66e-7585-4bc3-aaa7-dcdc37635fc5") : secret "image-registry-tls" not found Apr 20 07:50:37.394677 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:37.394321 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 07:50:37.394677 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:37.394347 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5eaa06ed-8495-4d3f-a0e2-d67bb19b8695-cert podName:5eaa06ed-8495-4d3f-a0e2-d67bb19b8695 nodeName:}" failed. No retries permitted until 2026-04-20 07:50:38.394338888 +0000 UTC m=+35.567666296 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5eaa06ed-8495-4d3f-a0e2-d67bb19b8695-cert") pod "ingress-canary-mqbmm" (UID: "5eaa06ed-8495-4d3f-a0e2-d67bb19b8695") : secret "canary-serving-cert" not found Apr 20 07:50:37.495126 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:37.495041 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/eccac7f6-7962-45d6-9141-b02deb90631f-metrics-tls\") pod \"dns-default-jp28h\" (UID: \"eccac7f6-7962-45d6-9141-b02deb90631f\") " pod="openshift-dns/dns-default-jp28h" Apr 20 07:50:37.495273 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:37.495182 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 07:50:37.495273 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:37.495242 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eccac7f6-7962-45d6-9141-b02deb90631f-metrics-tls podName:eccac7f6-7962-45d6-9141-b02deb90631f nodeName:}" failed. No retries permitted until 2026-04-20 07:50:38.495227844 +0000 UTC m=+35.668555251 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/eccac7f6-7962-45d6-9141-b02deb90631f-metrics-tls") pod "dns-default-jp28h" (UID: "eccac7f6-7962-45d6-9141-b02deb90631f") : secret "dns-default-metrics-tls" not found Apr 20 07:50:38.286316 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:38.286275 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sfl6t" Apr 20 07:50:38.286316 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:38.286307 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-nv6pl" Apr 20 07:50:38.286557 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:38.286423 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7qpdh" Apr 20 07:50:38.289298 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:38.289278 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 20 07:50:38.290554 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:38.290538 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-x6tc5\"" Apr 20 07:50:38.290599 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:38.290555 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-2g69h\"" Apr 20 07:50:38.290599 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:38.290575 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 20 07:50:38.290700 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:38.290671 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 20 07:50:38.290751 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:38.290676 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 20 07:50:38.401572 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:38.401542 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/397df66e-7585-4bc3-aaa7-dcdc37635fc5-registry-tls\") pod \"image-registry-7bd5554bd9-jrm2x\" (UID: \"397df66e-7585-4bc3-aaa7-dcdc37635fc5\") " 
pod="openshift-image-registry/image-registry-7bd5554bd9-jrm2x" Apr 20 07:50:38.402067 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:38.401592 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/2e4b6f0f-88ff-49a7-ada1-7af9515863da-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-smjd7\" (UID: \"2e4b6f0f-88ff-49a7-ada1-7af9515863da\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-smjd7" Apr 20 07:50:38.402067 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:38.401642 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5eaa06ed-8495-4d3f-a0e2-d67bb19b8695-cert\") pod \"ingress-canary-mqbmm\" (UID: \"5eaa06ed-8495-4d3f-a0e2-d67bb19b8695\") " pod="openshift-ingress-canary/ingress-canary-mqbmm" Apr 20 07:50:38.402067 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:38.401715 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 07:50:38.402067 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:38.401734 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7bd5554bd9-jrm2x: secret "image-registry-tls" not found Apr 20 07:50:38.402067 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:38.401787 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/397df66e-7585-4bc3-aaa7-dcdc37635fc5-registry-tls podName:397df66e-7585-4bc3-aaa7-dcdc37635fc5 nodeName:}" failed. No retries permitted until 2026-04-20 07:50:40.40176823 +0000 UTC m=+37.575095640 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/397df66e-7585-4bc3-aaa7-dcdc37635fc5-registry-tls") pod "image-registry-7bd5554bd9-jrm2x" (UID: "397df66e-7585-4bc3-aaa7-dcdc37635fc5") : secret "image-registry-tls" not found Apr 20 07:50:38.402067 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:38.401787 2575 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 20 07:50:38.402067 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:38.401849 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2e4b6f0f-88ff-49a7-ada1-7af9515863da-networking-console-plugin-cert podName:2e4b6f0f-88ff-49a7-ada1-7af9515863da nodeName:}" failed. No retries permitted until 2026-04-20 07:50:40.401833202 +0000 UTC m=+37.575160612 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/2e4b6f0f-88ff-49a7-ada1-7af9515863da-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-smjd7" (UID: "2e4b6f0f-88ff-49a7-ada1-7af9515863da") : secret "networking-console-plugin-cert" not found Apr 20 07:50:38.402067 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:38.401915 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 07:50:38.402067 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:38.401973 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5eaa06ed-8495-4d3f-a0e2-d67bb19b8695-cert podName:5eaa06ed-8495-4d3f-a0e2-d67bb19b8695 nodeName:}" failed. No retries permitted until 2026-04-20 07:50:40.401960335 +0000 UTC m=+37.575287756 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5eaa06ed-8495-4d3f-a0e2-d67bb19b8695-cert") pod "ingress-canary-mqbmm" (UID: "5eaa06ed-8495-4d3f-a0e2-d67bb19b8695") : secret "canary-serving-cert" not found Apr 20 07:50:38.472078 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:38.472051 2575 generic.go:358] "Generic (PLEG): container finished" podID="ba7d0588-c3eb-4849-ae3b-630be7fcc621" containerID="77959edbad81b93a06898181aa6dce1aa27d973ce83f9ec2c7a6c78e1513fa89" exitCode=0 Apr 20 07:50:38.472205 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:38.472103 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hzzk5" event={"ID":"ba7d0588-c3eb-4849-ae3b-630be7fcc621","Type":"ContainerDied","Data":"77959edbad81b93a06898181aa6dce1aa27d973ce83f9ec2c7a6c78e1513fa89"} Apr 20 07:50:38.503798 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:38.502348 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/eccac7f6-7962-45d6-9141-b02deb90631f-metrics-tls\") pod \"dns-default-jp28h\" (UID: \"eccac7f6-7962-45d6-9141-b02deb90631f\") " pod="openshift-dns/dns-default-jp28h" Apr 20 07:50:38.503798 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:38.502534 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 07:50:38.503798 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:38.502588 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eccac7f6-7962-45d6-9141-b02deb90631f-metrics-tls podName:eccac7f6-7962-45d6-9141-b02deb90631f nodeName:}" failed. No retries permitted until 2026-04-20 07:50:40.502571667 +0000 UTC m=+37.675899080 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/eccac7f6-7962-45d6-9141-b02deb90631f-metrics-tls") pod "dns-default-jp28h" (UID: "eccac7f6-7962-45d6-9141-b02deb90631f") : secret "dns-default-metrics-tls" not found Apr 20 07:50:39.308785 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:39.308749 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/036e621d-2979-4552-ac5c-6fab5743df3a-original-pull-secret\") pod \"global-pull-secret-syncer-nv6pl\" (UID: \"036e621d-2979-4552-ac5c-6fab5743df3a\") " pod="kube-system/global-pull-secret-syncer-nv6pl" Apr 20 07:50:39.311027 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:39.310998 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/036e621d-2979-4552-ac5c-6fab5743df3a-original-pull-secret\") pod \"global-pull-secret-syncer-nv6pl\" (UID: \"036e621d-2979-4552-ac5c-6fab5743df3a\") " pod="kube-system/global-pull-secret-syncer-nv6pl" Apr 20 07:50:39.475884 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:39.475855 2575 generic.go:358] "Generic (PLEG): container finished" podID="ba7d0588-c3eb-4849-ae3b-630be7fcc621" containerID="0edfda9a9501aec1061fd9113a8e78306c1219e09a524f93c04dadb2f49f79a7" exitCode=0 Apr 20 07:50:39.475884 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:39.475889 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hzzk5" event={"ID":"ba7d0588-c3eb-4849-ae3b-630be7fcc621","Type":"ContainerDied","Data":"0edfda9a9501aec1061fd9113a8e78306c1219e09a524f93c04dadb2f49f79a7"} Apr 20 07:50:39.507088 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:39.506854 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-nv6pl" Apr 20 07:50:39.650240 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:39.650209 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-nv6pl"] Apr 20 07:50:39.654025 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:39.653997 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod036e621d_2979_4552_ac5c_6fab5743df3a.slice/crio-f1bd10f5d10c2a708992a7a05631c1bd6ccf9ea5f94039d9027255f1181329d1 WatchSource:0}: Error finding container f1bd10f5d10c2a708992a7a05631c1bd6ccf9ea5f94039d9027255f1181329d1: Status 404 returned error can't find the container with id f1bd10f5d10c2a708992a7a05631c1bd6ccf9ea5f94039d9027255f1181329d1 Apr 20 07:50:40.419156 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:40.419120 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5eaa06ed-8495-4d3f-a0e2-d67bb19b8695-cert\") pod \"ingress-canary-mqbmm\" (UID: \"5eaa06ed-8495-4d3f-a0e2-d67bb19b8695\") " pod="openshift-ingress-canary/ingress-canary-mqbmm" Apr 20 07:50:40.419333 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:40.419298 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 07:50:40.419393 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:40.419304 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/397df66e-7585-4bc3-aaa7-dcdc37635fc5-registry-tls\") pod \"image-registry-7bd5554bd9-jrm2x\" (UID: \"397df66e-7585-4bc3-aaa7-dcdc37635fc5\") " pod="openshift-image-registry/image-registry-7bd5554bd9-jrm2x" Apr 20 07:50:40.419393 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:40.419375 2575 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/5eaa06ed-8495-4d3f-a0e2-d67bb19b8695-cert podName:5eaa06ed-8495-4d3f-a0e2-d67bb19b8695 nodeName:}" failed. No retries permitted until 2026-04-20 07:50:44.419352303 +0000 UTC m=+41.592679715 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5eaa06ed-8495-4d3f-a0e2-d67bb19b8695-cert") pod "ingress-canary-mqbmm" (UID: "5eaa06ed-8495-4d3f-a0e2-d67bb19b8695") : secret "canary-serving-cert" not found Apr 20 07:50:40.419495 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:40.419377 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 07:50:40.419495 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:40.419459 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/2e4b6f0f-88ff-49a7-ada1-7af9515863da-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-smjd7\" (UID: \"2e4b6f0f-88ff-49a7-ada1-7af9515863da\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-smjd7" Apr 20 07:50:40.419495 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:40.419480 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7bd5554bd9-jrm2x: secret "image-registry-tls" not found Apr 20 07:50:40.419635 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:40.419509 2575 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 20 07:50:40.419635 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:40.419537 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/397df66e-7585-4bc3-aaa7-dcdc37635fc5-registry-tls podName:397df66e-7585-4bc3-aaa7-dcdc37635fc5 nodeName:}" failed. 
No retries permitted until 2026-04-20 07:50:44.419518889 +0000 UTC m=+41.592846309 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/397df66e-7585-4bc3-aaa7-dcdc37635fc5-registry-tls") pod "image-registry-7bd5554bd9-jrm2x" (UID: "397df66e-7585-4bc3-aaa7-dcdc37635fc5") : secret "image-registry-tls" not found Apr 20 07:50:40.419635 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:40.419555 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2e4b6f0f-88ff-49a7-ada1-7af9515863da-networking-console-plugin-cert podName:2e4b6f0f-88ff-49a7-ada1-7af9515863da nodeName:}" failed. No retries permitted until 2026-04-20 07:50:44.41954561 +0000 UTC m=+41.592873024 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/2e4b6f0f-88ff-49a7-ada1-7af9515863da-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-smjd7" (UID: "2e4b6f0f-88ff-49a7-ada1-7af9515863da") : secret "networking-console-plugin-cert" not found Apr 20 07:50:40.479678 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:40.479646 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-nv6pl" event={"ID":"036e621d-2979-4552-ac5c-6fab5743df3a","Type":"ContainerStarted","Data":"f1bd10f5d10c2a708992a7a05631c1bd6ccf9ea5f94039d9027255f1181329d1"} Apr 20 07:50:40.482977 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:40.482948 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hzzk5" event={"ID":"ba7d0588-c3eb-4849-ae3b-630be7fcc621","Type":"ContainerStarted","Data":"9b6ab33956795447ff819187e480919b3c8f369966b7d144b4e5763c3b5b977f"} Apr 20 07:50:40.505281 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:40.505223 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-multus/multus-additional-cni-plugins-hzzk5" podStartSLOduration=6.211093513 podStartE2EDuration="37.505204775s" podCreationTimestamp="2026-04-20 07:50:03 +0000 UTC" firstStartedPulling="2026-04-20 07:50:06.249884223 +0000 UTC m=+3.423211636" lastFinishedPulling="2026-04-20 07:50:37.543995488 +0000 UTC m=+34.717322898" observedRunningTime="2026-04-20 07:50:40.504432223 +0000 UTC m=+37.677759654" watchObservedRunningTime="2026-04-20 07:50:40.505204775 +0000 UTC m=+37.678532204" Apr 20 07:50:40.520701 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:40.520608 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/eccac7f6-7962-45d6-9141-b02deb90631f-metrics-tls\") pod \"dns-default-jp28h\" (UID: \"eccac7f6-7962-45d6-9141-b02deb90631f\") " pod="openshift-dns/dns-default-jp28h" Apr 20 07:50:40.520884 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:40.520771 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 07:50:40.520884 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:40.520844 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eccac7f6-7962-45d6-9141-b02deb90631f-metrics-tls podName:eccac7f6-7962-45d6-9141-b02deb90631f nodeName:}" failed. No retries permitted until 2026-04-20 07:50:44.520829805 +0000 UTC m=+41.694157216 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/eccac7f6-7962-45d6-9141-b02deb90631f-metrics-tls") pod "dns-default-jp28h" (UID: "eccac7f6-7962-45d6-9141-b02deb90631f") : secret "dns-default-metrics-tls" not found Apr 20 07:50:44.452543 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:44.452494 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/397df66e-7585-4bc3-aaa7-dcdc37635fc5-registry-tls\") pod \"image-registry-7bd5554bd9-jrm2x\" (UID: \"397df66e-7585-4bc3-aaa7-dcdc37635fc5\") " pod="openshift-image-registry/image-registry-7bd5554bd9-jrm2x" Apr 20 07:50:44.452963 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:44.452652 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 07:50:44.452963 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:44.452671 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7bd5554bd9-jrm2x: secret "image-registry-tls" not found Apr 20 07:50:44.452963 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:44.452669 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/2e4b6f0f-88ff-49a7-ada1-7af9515863da-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-smjd7\" (UID: \"2e4b6f0f-88ff-49a7-ada1-7af9515863da\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-smjd7" Apr 20 07:50:44.452963 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:44.452718 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/397df66e-7585-4bc3-aaa7-dcdc37635fc5-registry-tls podName:397df66e-7585-4bc3-aaa7-dcdc37635fc5 nodeName:}" failed. 
No retries permitted until 2026-04-20 07:50:52.452704082 +0000 UTC m=+49.626031489 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/397df66e-7585-4bc3-aaa7-dcdc37635fc5-registry-tls") pod "image-registry-7bd5554bd9-jrm2x" (UID: "397df66e-7585-4bc3-aaa7-dcdc37635fc5") : secret "image-registry-tls" not found Apr 20 07:50:44.452963 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:44.452734 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5eaa06ed-8495-4d3f-a0e2-d67bb19b8695-cert\") pod \"ingress-canary-mqbmm\" (UID: \"5eaa06ed-8495-4d3f-a0e2-d67bb19b8695\") " pod="openshift-ingress-canary/ingress-canary-mqbmm" Apr 20 07:50:44.452963 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:44.452768 2575 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 20 07:50:44.452963 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:44.452818 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2e4b6f0f-88ff-49a7-ada1-7af9515863da-networking-console-plugin-cert podName:2e4b6f0f-88ff-49a7-ada1-7af9515863da nodeName:}" failed. No retries permitted until 2026-04-20 07:50:52.452804529 +0000 UTC m=+49.626131941 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/2e4b6f0f-88ff-49a7-ada1-7af9515863da-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-smjd7" (UID: "2e4b6f0f-88ff-49a7-ada1-7af9515863da") : secret "networking-console-plugin-cert" not found Apr 20 07:50:44.452963 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:44.452830 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 07:50:44.452963 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:44.452855 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5eaa06ed-8495-4d3f-a0e2-d67bb19b8695-cert podName:5eaa06ed-8495-4d3f-a0e2-d67bb19b8695 nodeName:}" failed. No retries permitted until 2026-04-20 07:50:52.452847448 +0000 UTC m=+49.626174855 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5eaa06ed-8495-4d3f-a0e2-d67bb19b8695-cert") pod "ingress-canary-mqbmm" (UID: "5eaa06ed-8495-4d3f-a0e2-d67bb19b8695") : secret "canary-serving-cert" not found Apr 20 07:50:44.491973 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:44.491938 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-nv6pl" event={"ID":"036e621d-2979-4552-ac5c-6fab5743df3a","Type":"ContainerStarted","Data":"139172bb3ef5a22daef394c670c22b2deb8b59b26e17dc19b10c7bdbb6ea7a6d"} Apr 20 07:50:44.505720 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:44.505674 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-nv6pl" podStartSLOduration=33.61523271 podStartE2EDuration="37.505658866s" podCreationTimestamp="2026-04-20 07:50:07 +0000 UTC" firstStartedPulling="2026-04-20 07:50:39.656067864 +0000 UTC m=+36.829395271" lastFinishedPulling="2026-04-20 07:50:43.546494021 +0000 UTC m=+40.719821427" 
observedRunningTime="2026-04-20 07:50:44.504573236 +0000 UTC m=+41.677900666" watchObservedRunningTime="2026-04-20 07:50:44.505658866 +0000 UTC m=+41.678986296" Apr 20 07:50:44.553500 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:44.553460 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/eccac7f6-7962-45d6-9141-b02deb90631f-metrics-tls\") pod \"dns-default-jp28h\" (UID: \"eccac7f6-7962-45d6-9141-b02deb90631f\") " pod="openshift-dns/dns-default-jp28h" Apr 20 07:50:44.553673 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:44.553564 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 07:50:44.553673 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:44.553655 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eccac7f6-7962-45d6-9141-b02deb90631f-metrics-tls podName:eccac7f6-7962-45d6-9141-b02deb90631f nodeName:}" failed. No retries permitted until 2026-04-20 07:50:52.55363509 +0000 UTC m=+49.726962512 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/eccac7f6-7962-45d6-9141-b02deb90631f-metrics-tls") pod "dns-default-jp28h" (UID: "eccac7f6-7962-45d6-9141-b02deb90631f") : secret "dns-default-metrics-tls" not found Apr 20 07:50:49.346361 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:49.346331 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-9987b544c-b9jrp"] Apr 20 07:50:49.374139 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:49.374099 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-9987b544c-b9jrp"] Apr 20 07:50:49.374282 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:49.374212 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-9987b544c-b9jrp" Apr 20 07:50:49.376997 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:49.376975 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 20 07:50:49.377116 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:49.377011 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 20 07:50:49.378075 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:49.378057 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\"" Apr 20 07:50:49.378193 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:49.378097 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 20 07:50:49.380807 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:49.380599 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/bfa9d59b-d0a7-43f4-aa1a-cf1b4a87ee7b-klusterlet-config\") pod \"klusterlet-addon-workmgr-9987b544c-b9jrp\" (UID: \"bfa9d59b-d0a7-43f4-aa1a-cf1b4a87ee7b\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-9987b544c-b9jrp" Apr 20 07:50:49.380807 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:49.380786 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/bfa9d59b-d0a7-43f4-aa1a-cf1b4a87ee7b-tmp\") pod \"klusterlet-addon-workmgr-9987b544c-b9jrp\" (UID: \"bfa9d59b-d0a7-43f4-aa1a-cf1b4a87ee7b\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-9987b544c-b9jrp" Apr 20 
07:50:49.380930 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:49.380908 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gp8ck\" (UniqueName: \"kubernetes.io/projected/bfa9d59b-d0a7-43f4-aa1a-cf1b4a87ee7b-kube-api-access-gp8ck\") pod \"klusterlet-addon-workmgr-9987b544c-b9jrp\" (UID: \"bfa9d59b-d0a7-43f4-aa1a-cf1b4a87ee7b\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-9987b544c-b9jrp" Apr 20 07:50:49.481850 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:49.481817 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/bfa9d59b-d0a7-43f4-aa1a-cf1b4a87ee7b-tmp\") pod \"klusterlet-addon-workmgr-9987b544c-b9jrp\" (UID: \"bfa9d59b-d0a7-43f4-aa1a-cf1b4a87ee7b\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-9987b544c-b9jrp" Apr 20 07:50:49.482043 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:49.481886 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gp8ck\" (UniqueName: \"kubernetes.io/projected/bfa9d59b-d0a7-43f4-aa1a-cf1b4a87ee7b-kube-api-access-gp8ck\") pod \"klusterlet-addon-workmgr-9987b544c-b9jrp\" (UID: \"bfa9d59b-d0a7-43f4-aa1a-cf1b4a87ee7b\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-9987b544c-b9jrp" Apr 20 07:50:49.482043 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:49.481914 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/bfa9d59b-d0a7-43f4-aa1a-cf1b4a87ee7b-klusterlet-config\") pod \"klusterlet-addon-workmgr-9987b544c-b9jrp\" (UID: \"bfa9d59b-d0a7-43f4-aa1a-cf1b4a87ee7b\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-9987b544c-b9jrp" Apr 20 07:50:49.482254 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:49.482231 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/bfa9d59b-d0a7-43f4-aa1a-cf1b4a87ee7b-tmp\") pod \"klusterlet-addon-workmgr-9987b544c-b9jrp\" (UID: \"bfa9d59b-d0a7-43f4-aa1a-cf1b4a87ee7b\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-9987b544c-b9jrp" Apr 20 07:50:49.485698 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:49.485672 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/bfa9d59b-d0a7-43f4-aa1a-cf1b4a87ee7b-klusterlet-config\") pod \"klusterlet-addon-workmgr-9987b544c-b9jrp\" (UID: \"bfa9d59b-d0a7-43f4-aa1a-cf1b4a87ee7b\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-9987b544c-b9jrp" Apr 20 07:50:49.489333 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:49.489311 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gp8ck\" (UniqueName: \"kubernetes.io/projected/bfa9d59b-d0a7-43f4-aa1a-cf1b4a87ee7b-kube-api-access-gp8ck\") pod \"klusterlet-addon-workmgr-9987b544c-b9jrp\" (UID: \"bfa9d59b-d0a7-43f4-aa1a-cf1b4a87ee7b\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-9987b544c-b9jrp" Apr 20 07:50:49.683240 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:49.683205 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-9987b544c-b9jrp" Apr 20 07:50:49.806508 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:49.806471 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-9987b544c-b9jrp"] Apr 20 07:50:49.809456 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:50:49.809424 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbfa9d59b_d0a7_43f4_aa1a_cf1b4a87ee7b.slice/crio-7ef38ab448e88d03f5bdffa4c75c721e85df4613452c21511b012cf461b64404 WatchSource:0}: Error finding container 7ef38ab448e88d03f5bdffa4c75c721e85df4613452c21511b012cf461b64404: Status 404 returned error can't find the container with id 7ef38ab448e88d03f5bdffa4c75c721e85df4613452c21511b012cf461b64404 Apr 20 07:50:50.505270 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:50.505218 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-9987b544c-b9jrp" event={"ID":"bfa9d59b-d0a7-43f4-aa1a-cf1b4a87ee7b","Type":"ContainerStarted","Data":"7ef38ab448e88d03f5bdffa4c75c721e85df4613452c21511b012cf461b64404"} Apr 20 07:50:52.505066 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:52.505024 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/397df66e-7585-4bc3-aaa7-dcdc37635fc5-registry-tls\") pod \"image-registry-7bd5554bd9-jrm2x\" (UID: \"397df66e-7585-4bc3-aaa7-dcdc37635fc5\") " pod="openshift-image-registry/image-registry-7bd5554bd9-jrm2x" Apr 20 07:50:52.505635 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:52.505097 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/2e4b6f0f-88ff-49a7-ada1-7af9515863da-networking-console-plugin-cert\") pod 
\"networking-console-plugin-cb95c66f6-smjd7\" (UID: \"2e4b6f0f-88ff-49a7-ada1-7af9515863da\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-smjd7" Apr 20 07:50:52.505635 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:52.505139 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5eaa06ed-8495-4d3f-a0e2-d67bb19b8695-cert\") pod \"ingress-canary-mqbmm\" (UID: \"5eaa06ed-8495-4d3f-a0e2-d67bb19b8695\") " pod="openshift-ingress-canary/ingress-canary-mqbmm" Apr 20 07:50:52.505635 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:52.505207 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 07:50:52.505635 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:52.505234 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7bd5554bd9-jrm2x: secret "image-registry-tls" not found Apr 20 07:50:52.505635 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:52.505248 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 07:50:52.505635 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:52.505255 2575 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 20 07:50:52.505635 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:52.505304 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/397df66e-7585-4bc3-aaa7-dcdc37635fc5-registry-tls podName:397df66e-7585-4bc3-aaa7-dcdc37635fc5 nodeName:}" failed. No retries permitted until 2026-04-20 07:51:08.50528115 +0000 UTC m=+65.678608575 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/397df66e-7585-4bc3-aaa7-dcdc37635fc5-registry-tls") pod "image-registry-7bd5554bd9-jrm2x" (UID: "397df66e-7585-4bc3-aaa7-dcdc37635fc5") : secret "image-registry-tls" not found Apr 20 07:50:52.505635 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:52.505324 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5eaa06ed-8495-4d3f-a0e2-d67bb19b8695-cert podName:5eaa06ed-8495-4d3f-a0e2-d67bb19b8695 nodeName:}" failed. No retries permitted until 2026-04-20 07:51:08.505313187 +0000 UTC m=+65.678640598 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5eaa06ed-8495-4d3f-a0e2-d67bb19b8695-cert") pod "ingress-canary-mqbmm" (UID: "5eaa06ed-8495-4d3f-a0e2-d67bb19b8695") : secret "canary-serving-cert" not found Apr 20 07:50:52.505635 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:52.505339 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2e4b6f0f-88ff-49a7-ada1-7af9515863da-networking-console-plugin-cert podName:2e4b6f0f-88ff-49a7-ada1-7af9515863da nodeName:}" failed. No retries permitted until 2026-04-20 07:51:08.505330358 +0000 UTC m=+65.678657770 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/2e4b6f0f-88ff-49a7-ada1-7af9515863da-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-smjd7" (UID: "2e4b6f0f-88ff-49a7-ada1-7af9515863da") : secret "networking-console-plugin-cert" not found Apr 20 07:50:52.605728 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:52.605693 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/eccac7f6-7962-45d6-9141-b02deb90631f-metrics-tls\") pod \"dns-default-jp28h\" (UID: \"eccac7f6-7962-45d6-9141-b02deb90631f\") " pod="openshift-dns/dns-default-jp28h" Apr 20 07:50:52.605894 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:52.605825 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 07:50:52.605894 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:50:52.605878 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eccac7f6-7962-45d6-9141-b02deb90631f-metrics-tls podName:eccac7f6-7962-45d6-9141-b02deb90631f nodeName:}" failed. No retries permitted until 2026-04-20 07:51:08.605863832 +0000 UTC m=+65.779191239 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/eccac7f6-7962-45d6-9141-b02deb90631f-metrics-tls") pod "dns-default-jp28h" (UID: "eccac7f6-7962-45d6-9141-b02deb90631f") : secret "dns-default-metrics-tls" not found Apr 20 07:50:54.513980 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:54.513945 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-9987b544c-b9jrp" event={"ID":"bfa9d59b-d0a7-43f4-aa1a-cf1b4a87ee7b","Type":"ContainerStarted","Data":"7d68e609aff8ab7002b70489fecb53b3cd813001462cc1d9c9469ed78477719e"} Apr 20 07:50:54.514375 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:54.514151 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-9987b544c-b9jrp" Apr 20 07:50:54.515808 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:54.515784 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-9987b544c-b9jrp" Apr 20 07:50:54.528625 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:50:54.528568 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-9987b544c-b9jrp" podStartSLOduration=1.583600093 podStartE2EDuration="5.528556198s" podCreationTimestamp="2026-04-20 07:50:49 +0000 UTC" firstStartedPulling="2026-04-20 07:50:49.81127874 +0000 UTC m=+46.984606151" lastFinishedPulling="2026-04-20 07:50:53.75623485 +0000 UTC m=+50.929562256" observedRunningTime="2026-04-20 07:50:54.527822518 +0000 UTC m=+51.701149943" watchObservedRunningTime="2026-04-20 07:50:54.528556198 +0000 UTC m=+51.701883626" Apr 20 07:51:02.469404 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:51:02.469369 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-crn8d" Apr 20 
07:51:08.531116 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:51:08.531079 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/397df66e-7585-4bc3-aaa7-dcdc37635fc5-registry-tls\") pod \"image-registry-7bd5554bd9-jrm2x\" (UID: \"397df66e-7585-4bc3-aaa7-dcdc37635fc5\") " pod="openshift-image-registry/image-registry-7bd5554bd9-jrm2x" Apr 20 07:51:08.531568 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:51:08.531137 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/2e4b6f0f-88ff-49a7-ada1-7af9515863da-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-smjd7\" (UID: \"2e4b6f0f-88ff-49a7-ada1-7af9515863da\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-smjd7" Apr 20 07:51:08.531568 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:51:08.531174 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5eaa06ed-8495-4d3f-a0e2-d67bb19b8695-cert\") pod \"ingress-canary-mqbmm\" (UID: \"5eaa06ed-8495-4d3f-a0e2-d67bb19b8695\") " pod="openshift-ingress-canary/ingress-canary-mqbmm" Apr 20 07:51:08.531568 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:51:08.531226 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 07:51:08.531568 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:51:08.531244 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7bd5554bd9-jrm2x: secret "image-registry-tls" not found Apr 20 07:51:08.531568 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:51:08.531271 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 07:51:08.531568 ip-10-0-129-24 
kubenswrapper[2575]: E0420 07:51:08.531289 2575 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 20 07:51:08.531568 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:51:08.531302 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/397df66e-7585-4bc3-aaa7-dcdc37635fc5-registry-tls podName:397df66e-7585-4bc3-aaa7-dcdc37635fc5 nodeName:}" failed. No retries permitted until 2026-04-20 07:51:40.531287848 +0000 UTC m=+97.704615256 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/397df66e-7585-4bc3-aaa7-dcdc37635fc5-registry-tls") pod "image-registry-7bd5554bd9-jrm2x" (UID: "397df66e-7585-4bc3-aaa7-dcdc37635fc5") : secret "image-registry-tls" not found Apr 20 07:51:08.531568 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:51:08.531330 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5eaa06ed-8495-4d3f-a0e2-d67bb19b8695-cert podName:5eaa06ed-8495-4d3f-a0e2-d67bb19b8695 nodeName:}" failed. No retries permitted until 2026-04-20 07:51:40.531313955 +0000 UTC m=+97.704641383 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5eaa06ed-8495-4d3f-a0e2-d67bb19b8695-cert") pod "ingress-canary-mqbmm" (UID: "5eaa06ed-8495-4d3f-a0e2-d67bb19b8695") : secret "canary-serving-cert" not found Apr 20 07:51:08.531568 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:51:08.531348 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2e4b6f0f-88ff-49a7-ada1-7af9515863da-networking-console-plugin-cert podName:2e4b6f0f-88ff-49a7-ada1-7af9515863da nodeName:}" failed. No retries permitted until 2026-04-20 07:51:40.531338453 +0000 UTC m=+97.704665861 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/2e4b6f0f-88ff-49a7-ada1-7af9515863da-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-smjd7" (UID: "2e4b6f0f-88ff-49a7-ada1-7af9515863da") : secret "networking-console-plugin-cert" not found Apr 20 07:51:08.632243 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:51:08.632187 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/eccac7f6-7962-45d6-9141-b02deb90631f-metrics-tls\") pod \"dns-default-jp28h\" (UID: \"eccac7f6-7962-45d6-9141-b02deb90631f\") " pod="openshift-dns/dns-default-jp28h" Apr 20 07:51:08.632415 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:51:08.632331 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 07:51:08.632415 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:51:08.632407 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eccac7f6-7962-45d6-9141-b02deb90631f-metrics-tls podName:eccac7f6-7962-45d6-9141-b02deb90631f nodeName:}" failed. No retries permitted until 2026-04-20 07:51:40.632392652 +0000 UTC m=+97.805720059 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/eccac7f6-7962-45d6-9141-b02deb90631f-metrics-tls") pod "dns-default-jp28h" (UID: "eccac7f6-7962-45d6-9141-b02deb90631f") : secret "dns-default-metrics-tls" not found Apr 20 07:51:09.036089 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:51:09.036051 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/46b8deca-b33b-45e1-9131-b47fde192a78-metrics-certs\") pod \"network-metrics-daemon-7qpdh\" (UID: \"46b8deca-b33b-45e1-9131-b47fde192a78\") " pod="openshift-multus/network-metrics-daemon-7qpdh" Apr 20 07:51:09.039277 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:51:09.039251 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 20 07:51:09.046686 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:51:09.046660 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 20 07:51:09.046807 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:51:09.046729 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/46b8deca-b33b-45e1-9131-b47fde192a78-metrics-certs podName:46b8deca-b33b-45e1-9131-b47fde192a78 nodeName:}" failed. No retries permitted until 2026-04-20 07:52:13.046713892 +0000 UTC m=+130.220041298 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/46b8deca-b33b-45e1-9131-b47fde192a78-metrics-certs") pod "network-metrics-daemon-7qpdh" (UID: "46b8deca-b33b-45e1-9131-b47fde192a78") : secret "metrics-daemon-secret" not found Apr 20 07:51:09.238290 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:51:09.238258 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8nc7c\" (UniqueName: \"kubernetes.io/projected/ef4d76cc-35ef-46b3-83b9-ee5fe3bb7a06-kube-api-access-8nc7c\") pod \"network-check-target-sfl6t\" (UID: \"ef4d76cc-35ef-46b3-83b9-ee5fe3bb7a06\") " pod="openshift-network-diagnostics/network-check-target-sfl6t" Apr 20 07:51:09.241289 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:51:09.241264 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 20 07:51:09.251548 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:51:09.251523 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 20 07:51:09.261925 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:51:09.261896 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8nc7c\" (UniqueName: \"kubernetes.io/projected/ef4d76cc-35ef-46b3-83b9-ee5fe3bb7a06-kube-api-access-8nc7c\") pod \"network-check-target-sfl6t\" (UID: \"ef4d76cc-35ef-46b3-83b9-ee5fe3bb7a06\") " pod="openshift-network-diagnostics/network-check-target-sfl6t" Apr 20 07:51:09.498777 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:51:09.498748 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-x6tc5\"" Apr 20 07:51:09.506943 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:51:09.506925 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sfl6t" Apr 20 07:51:09.618124 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:51:09.618006 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-sfl6t"] Apr 20 07:51:10.545262 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:51:10.545229 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-sfl6t" event={"ID":"ef4d76cc-35ef-46b3-83b9-ee5fe3bb7a06","Type":"ContainerStarted","Data":"8e76dc1f6715422ba8d4db33fb87a2df6b78c413fbf36db718f391d99bce4673"} Apr 20 07:51:13.552558 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:51:13.552517 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-sfl6t" event={"ID":"ef4d76cc-35ef-46b3-83b9-ee5fe3bb7a06","Type":"ContainerStarted","Data":"7c80f920daf809e46311a166bf478e7ae57d58c1691d87ba349e9001319ef629"} Apr 20 07:51:13.552977 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:51:13.552745 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-sfl6t" Apr 20 07:51:13.567464 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:51:13.567414 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-sfl6t" podStartSLOduration=67.476986831 podStartE2EDuration="1m10.567400329s" podCreationTimestamp="2026-04-20 07:50:03 +0000 UTC" firstStartedPulling="2026-04-20 07:51:09.623098369 +0000 UTC m=+66.796425776" lastFinishedPulling="2026-04-20 07:51:12.713511865 +0000 UTC m=+69.886839274" observedRunningTime="2026-04-20 07:51:13.565780635 +0000 UTC m=+70.739108064" watchObservedRunningTime="2026-04-20 07:51:13.567400329 +0000 UTC m=+70.740727758" Apr 20 07:51:40.586066 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:51:40.586028 2575 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/397df66e-7585-4bc3-aaa7-dcdc37635fc5-registry-tls\") pod \"image-registry-7bd5554bd9-jrm2x\" (UID: \"397df66e-7585-4bc3-aaa7-dcdc37635fc5\") " pod="openshift-image-registry/image-registry-7bd5554bd9-jrm2x" Apr 20 07:51:40.586520 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:51:40.586083 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/2e4b6f0f-88ff-49a7-ada1-7af9515863da-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-smjd7\" (UID: \"2e4b6f0f-88ff-49a7-ada1-7af9515863da\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-smjd7" Apr 20 07:51:40.586520 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:51:40.586107 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5eaa06ed-8495-4d3f-a0e2-d67bb19b8695-cert\") pod \"ingress-canary-mqbmm\" (UID: \"5eaa06ed-8495-4d3f-a0e2-d67bb19b8695\") " pod="openshift-ingress-canary/ingress-canary-mqbmm" Apr 20 07:51:40.586520 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:51:40.586169 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 07:51:40.586520 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:51:40.586189 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7bd5554bd9-jrm2x: secret "image-registry-tls" not found Apr 20 07:51:40.586520 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:51:40.586192 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 07:51:40.586520 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:51:40.586237 2575 secret.go:189] Couldn't get secret 
openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 20 07:51:40.586520 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:51:40.586245 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5eaa06ed-8495-4d3f-a0e2-d67bb19b8695-cert podName:5eaa06ed-8495-4d3f-a0e2-d67bb19b8695 nodeName:}" failed. No retries permitted until 2026-04-20 07:52:44.586228803 +0000 UTC m=+161.759556215 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5eaa06ed-8495-4d3f-a0e2-d67bb19b8695-cert") pod "ingress-canary-mqbmm" (UID: "5eaa06ed-8495-4d3f-a0e2-d67bb19b8695") : secret "canary-serving-cert" not found Apr 20 07:51:40.586520 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:51:40.586308 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/397df66e-7585-4bc3-aaa7-dcdc37635fc5-registry-tls podName:397df66e-7585-4bc3-aaa7-dcdc37635fc5 nodeName:}" failed. No retries permitted until 2026-04-20 07:52:44.586292752 +0000 UTC m=+161.759620159 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/397df66e-7585-4bc3-aaa7-dcdc37635fc5-registry-tls") pod "image-registry-7bd5554bd9-jrm2x" (UID: "397df66e-7585-4bc3-aaa7-dcdc37635fc5") : secret "image-registry-tls" not found Apr 20 07:51:40.586520 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:51:40.586325 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2e4b6f0f-88ff-49a7-ada1-7af9515863da-networking-console-plugin-cert podName:2e4b6f0f-88ff-49a7-ada1-7af9515863da nodeName:}" failed. No retries permitted until 2026-04-20 07:52:44.586318711 +0000 UTC m=+161.759646118 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/2e4b6f0f-88ff-49a7-ada1-7af9515863da-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-smjd7" (UID: "2e4b6f0f-88ff-49a7-ada1-7af9515863da") : secret "networking-console-plugin-cert" not found Apr 20 07:51:40.687247 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:51:40.687222 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/eccac7f6-7962-45d6-9141-b02deb90631f-metrics-tls\") pod \"dns-default-jp28h\" (UID: \"eccac7f6-7962-45d6-9141-b02deb90631f\") " pod="openshift-dns/dns-default-jp28h" Apr 20 07:51:40.687393 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:51:40.687339 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 07:51:40.687431 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:51:40.687396 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eccac7f6-7962-45d6-9141-b02deb90631f-metrics-tls podName:eccac7f6-7962-45d6-9141-b02deb90631f nodeName:}" failed. No retries permitted until 2026-04-20 07:52:44.68737959 +0000 UTC m=+161.860707003 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/eccac7f6-7962-45d6-9141-b02deb90631f-metrics-tls") pod "dns-default-jp28h" (UID: "eccac7f6-7962-45d6-9141-b02deb90631f") : secret "dns-default-metrics-tls" not found Apr 20 07:51:44.557268 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:51:44.557238 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-sfl6t" Apr 20 07:52:13.129718 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:13.129678 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/46b8deca-b33b-45e1-9131-b47fde192a78-metrics-certs\") pod \"network-metrics-daemon-7qpdh\" (UID: \"46b8deca-b33b-45e1-9131-b47fde192a78\") " pod="openshift-multus/network-metrics-daemon-7qpdh" Apr 20 07:52:13.130187 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:52:13.129791 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 20 07:52:13.130187 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:52:13.129845 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/46b8deca-b33b-45e1-9131-b47fde192a78-metrics-certs podName:46b8deca-b33b-45e1-9131-b47fde192a78 nodeName:}" failed. No retries permitted until 2026-04-20 07:54:15.129828986 +0000 UTC m=+252.303156392 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/46b8deca-b33b-45e1-9131-b47fde192a78-metrics-certs") pod "network-metrics-daemon-7qpdh" (UID: "46b8deca-b33b-45e1-9131-b47fde192a78") : secret "metrics-daemon-secret" not found Apr 20 07:52:16.320268 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:16.320237 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-z54dw_14afe6ef-f671-4736-ba3f-ac6236c30291/dns-node-resolver/0.log" Apr 20 07:52:17.119197 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:17.119170 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-n9944_b7ed9bc9-8f0a-40e4-bd11-25b4f1c3cd39/node-ca/0.log" Apr 20 07:52:20.129596 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:20.129557 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-ztlpb"] Apr 20 07:52:20.132419 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:20.132402 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-ztlpb" Apr 20 07:52:20.134778 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:20.134753 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\"" Apr 20 07:52:20.134903 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:20.134888 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\"" Apr 20 07:52:20.134953 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:20.134910 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\"" Apr 20 07:52:20.136083 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:20.136062 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\"" Apr 20 07:52:20.136175 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:20.136085 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-qc4cm\"" Apr 20 07:52:20.144967 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:20.142577 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-ztlpb"] Apr 20 07:52:20.183044 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:20.183015 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1e9b4311-a667-4cd6-bc63-7df6eb349627-config\") pod \"service-ca-operator-d6fc45fc5-ztlpb\" (UID: \"1e9b4311-a667-4cd6-bc63-7df6eb349627\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-ztlpb" Apr 20 07:52:20.183044 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:20.183061 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1e9b4311-a667-4cd6-bc63-7df6eb349627-serving-cert\") pod \"service-ca-operator-d6fc45fc5-ztlpb\" (UID: \"1e9b4311-a667-4cd6-bc63-7df6eb349627\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-ztlpb" Apr 20 07:52:20.183226 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:20.183122 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6r9j\" (UniqueName: \"kubernetes.io/projected/1e9b4311-a667-4cd6-bc63-7df6eb349627-kube-api-access-t6r9j\") pod \"service-ca-operator-d6fc45fc5-ztlpb\" (UID: \"1e9b4311-a667-4cd6-bc63-7df6eb349627\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-ztlpb" Apr 20 07:52:20.283595 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:20.283562 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1e9b4311-a667-4cd6-bc63-7df6eb349627-serving-cert\") pod \"service-ca-operator-d6fc45fc5-ztlpb\" (UID: \"1e9b4311-a667-4cd6-bc63-7df6eb349627\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-ztlpb" Apr 20 07:52:20.283733 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:20.283602 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t6r9j\" (UniqueName: \"kubernetes.io/projected/1e9b4311-a667-4cd6-bc63-7df6eb349627-kube-api-access-t6r9j\") pod \"service-ca-operator-d6fc45fc5-ztlpb\" (UID: \"1e9b4311-a667-4cd6-bc63-7df6eb349627\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-ztlpb" Apr 20 07:52:20.283733 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:20.283701 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1e9b4311-a667-4cd6-bc63-7df6eb349627-config\") pod 
\"service-ca-operator-d6fc45fc5-ztlpb\" (UID: \"1e9b4311-a667-4cd6-bc63-7df6eb349627\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-ztlpb" Apr 20 07:52:20.284210 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:20.284189 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1e9b4311-a667-4cd6-bc63-7df6eb349627-config\") pod \"service-ca-operator-d6fc45fc5-ztlpb\" (UID: \"1e9b4311-a667-4cd6-bc63-7df6eb349627\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-ztlpb" Apr 20 07:52:20.285813 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:20.285790 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1e9b4311-a667-4cd6-bc63-7df6eb349627-serving-cert\") pod \"service-ca-operator-d6fc45fc5-ztlpb\" (UID: \"1e9b4311-a667-4cd6-bc63-7df6eb349627\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-ztlpb" Apr 20 07:52:20.291158 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:20.291139 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6r9j\" (UniqueName: \"kubernetes.io/projected/1e9b4311-a667-4cd6-bc63-7df6eb349627-kube-api-access-t6r9j\") pod \"service-ca-operator-d6fc45fc5-ztlpb\" (UID: \"1e9b4311-a667-4cd6-bc63-7df6eb349627\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-ztlpb" Apr 20 07:52:20.442393 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:20.442315 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-ztlpb" Apr 20 07:52:20.553331 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:20.553298 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-ztlpb"] Apr 20 07:52:20.556162 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:52:20.556128 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1e9b4311_a667_4cd6_bc63_7df6eb349627.slice/crio-0c887e831f08c6089d3dc11ec24dad40262e1e602fbb4bb46248d77468066970 WatchSource:0}: Error finding container 0c887e831f08c6089d3dc11ec24dad40262e1e602fbb4bb46248d77468066970: Status 404 returned error can't find the container with id 0c887e831f08c6089d3dc11ec24dad40262e1e602fbb4bb46248d77468066970 Apr 20 07:52:20.685035 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:20.685000 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-ztlpb" event={"ID":"1e9b4311-a667-4cd6-bc63-7df6eb349627","Type":"ContainerStarted","Data":"0c887e831f08c6089d3dc11ec24dad40262e1e602fbb4bb46248d77468066970"} Apr 20 07:52:23.692250 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:23.692214 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-ztlpb" event={"ID":"1e9b4311-a667-4cd6-bc63-7df6eb349627","Type":"ContainerStarted","Data":"07ed189a190248310f7dc3ebe86a783ac3aa4d6f618ae1016e939ad6888ad319"} Apr 20 07:52:23.707047 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:23.706995 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-ztlpb" podStartSLOduration=1.407731059 podStartE2EDuration="3.70698096s" podCreationTimestamp="2026-04-20 07:52:20 +0000 UTC" firstStartedPulling="2026-04-20 07:52:20.557921145 +0000 UTC 
m=+137.731248556" lastFinishedPulling="2026-04-20 07:52:22.857171047 +0000 UTC m=+140.030498457" observedRunningTime="2026-04-20 07:52:23.70602386 +0000 UTC m=+140.879351289" watchObservedRunningTime="2026-04-20 07:52:23.70698096 +0000 UTC m=+140.880308416" Apr 20 07:52:24.669762 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:24.669725 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-gjmnh"] Apr 20 07:52:24.672841 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:24.672821 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-gjmnh" Apr 20 07:52:24.675570 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:24.675547 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Apr 20 07:52:24.675690 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:24.675640 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Apr 20 07:52:24.676662 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:24.676646 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-qmzk9\"" Apr 20 07:52:24.679927 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:24.679909 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-gjmnh"] Apr 20 07:52:24.820395 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:24.820357 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fx8rg\" (UniqueName: \"kubernetes.io/projected/b167b4a9-fb74-441f-bdfc-f8c71416c3ff-kube-api-access-fx8rg\") pod \"migrator-74bb7799d9-gjmnh\" (UID: 
\"b167b4a9-fb74-441f-bdfc-f8c71416c3ff\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-gjmnh" Apr 20 07:52:24.921497 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:24.921404 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fx8rg\" (UniqueName: \"kubernetes.io/projected/b167b4a9-fb74-441f-bdfc-f8c71416c3ff-kube-api-access-fx8rg\") pod \"migrator-74bb7799d9-gjmnh\" (UID: \"b167b4a9-fb74-441f-bdfc-f8c71416c3ff\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-gjmnh" Apr 20 07:52:24.929511 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:24.929476 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fx8rg\" (UniqueName: \"kubernetes.io/projected/b167b4a9-fb74-441f-bdfc-f8c71416c3ff-kube-api-access-fx8rg\") pod \"migrator-74bb7799d9-gjmnh\" (UID: \"b167b4a9-fb74-441f-bdfc-f8c71416c3ff\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-gjmnh" Apr 20 07:52:24.981335 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:24.981304 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-gjmnh" Apr 20 07:52:25.091587 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:25.091555 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-gjmnh"] Apr 20 07:52:25.095313 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:52:25.095280 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb167b4a9_fb74_441f_bdfc_f8c71416c3ff.slice/crio-672d18e14f83aa5f2025e58b08dff40d309223a51832facf78bdfb808f7c07b2 WatchSource:0}: Error finding container 672d18e14f83aa5f2025e58b08dff40d309223a51832facf78bdfb808f7c07b2: Status 404 returned error can't find the container with id 672d18e14f83aa5f2025e58b08dff40d309223a51832facf78bdfb808f7c07b2 Apr 20 07:52:25.697552 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:25.697520 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-gjmnh" event={"ID":"b167b4a9-fb74-441f-bdfc-f8c71416c3ff","Type":"ContainerStarted","Data":"672d18e14f83aa5f2025e58b08dff40d309223a51832facf78bdfb808f7c07b2"} Apr 20 07:52:26.476937 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:26.476895 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-m86n9"] Apr 20 07:52:26.479923 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:26.479901 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-m86n9" Apr 20 07:52:26.482718 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:26.482700 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 20 07:52:26.482718 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:26.482713 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 20 07:52:26.482887 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:26.482870 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 20 07:52:26.483913 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:26.483900 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 20 07:52:26.484186 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:26.484171 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-bh7nh\"" Apr 20 07:52:26.488578 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:26.488557 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-m86n9"] Apr 20 07:52:26.535597 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:26.535523 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/e6c7b136-0002-46ca-af48-03fceb87587b-signing-cabundle\") pod \"service-ca-865cb79987-m86n9\" (UID: \"e6c7b136-0002-46ca-af48-03fceb87587b\") " pod="openshift-service-ca/service-ca-865cb79987-m86n9" Apr 20 07:52:26.535597 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:26.535578 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xq5xf\" (UniqueName: 
\"kubernetes.io/projected/e6c7b136-0002-46ca-af48-03fceb87587b-kube-api-access-xq5xf\") pod \"service-ca-865cb79987-m86n9\" (UID: \"e6c7b136-0002-46ca-af48-03fceb87587b\") " pod="openshift-service-ca/service-ca-865cb79987-m86n9" Apr 20 07:52:26.535779 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:26.535679 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/e6c7b136-0002-46ca-af48-03fceb87587b-signing-key\") pod \"service-ca-865cb79987-m86n9\" (UID: \"e6c7b136-0002-46ca-af48-03fceb87587b\") " pod="openshift-service-ca/service-ca-865cb79987-m86n9" Apr 20 07:52:26.636170 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:26.636137 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/e6c7b136-0002-46ca-af48-03fceb87587b-signing-cabundle\") pod \"service-ca-865cb79987-m86n9\" (UID: \"e6c7b136-0002-46ca-af48-03fceb87587b\") " pod="openshift-service-ca/service-ca-865cb79987-m86n9" Apr 20 07:52:26.636318 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:26.636190 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xq5xf\" (UniqueName: \"kubernetes.io/projected/e6c7b136-0002-46ca-af48-03fceb87587b-kube-api-access-xq5xf\") pod \"service-ca-865cb79987-m86n9\" (UID: \"e6c7b136-0002-46ca-af48-03fceb87587b\") " pod="openshift-service-ca/service-ca-865cb79987-m86n9" Apr 20 07:52:26.636318 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:26.636247 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/e6c7b136-0002-46ca-af48-03fceb87587b-signing-key\") pod \"service-ca-865cb79987-m86n9\" (UID: \"e6c7b136-0002-46ca-af48-03fceb87587b\") " pod="openshift-service-ca/service-ca-865cb79987-m86n9" Apr 20 07:52:26.636788 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:26.636767 2575 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/e6c7b136-0002-46ca-af48-03fceb87587b-signing-cabundle\") pod \"service-ca-865cb79987-m86n9\" (UID: \"e6c7b136-0002-46ca-af48-03fceb87587b\") " pod="openshift-service-ca/service-ca-865cb79987-m86n9" Apr 20 07:52:26.638681 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:26.638652 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/e6c7b136-0002-46ca-af48-03fceb87587b-signing-key\") pod \"service-ca-865cb79987-m86n9\" (UID: \"e6c7b136-0002-46ca-af48-03fceb87587b\") " pod="openshift-service-ca/service-ca-865cb79987-m86n9" Apr 20 07:52:26.644926 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:26.644901 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xq5xf\" (UniqueName: \"kubernetes.io/projected/e6c7b136-0002-46ca-af48-03fceb87587b-kube-api-access-xq5xf\") pod \"service-ca-865cb79987-m86n9\" (UID: \"e6c7b136-0002-46ca-af48-03fceb87587b\") " pod="openshift-service-ca/service-ca-865cb79987-m86n9" Apr 20 07:52:26.701791 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:26.701758 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-gjmnh" event={"ID":"b167b4a9-fb74-441f-bdfc-f8c71416c3ff","Type":"ContainerStarted","Data":"dda052f46886756c9ea489f0f80e1e4de5b344d9695df4762267eacb2e936485"} Apr 20 07:52:26.701917 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:26.701798 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-gjmnh" event={"ID":"b167b4a9-fb74-441f-bdfc-f8c71416c3ff","Type":"ContainerStarted","Data":"572f9c05f3dc0364663a29e5121fa55decb3056a6b361cf965a7a04b0c2eb44f"} Apr 20 07:52:26.719697 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:26.719652 2575 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-gjmnh" podStartSLOduration=1.5682821150000001 podStartE2EDuration="2.719638272s" podCreationTimestamp="2026-04-20 07:52:24 +0000 UTC" firstStartedPulling="2026-04-20 07:52:25.097083238 +0000 UTC m=+142.270410645" lastFinishedPulling="2026-04-20 07:52:26.248439385 +0000 UTC m=+143.421766802" observedRunningTime="2026-04-20 07:52:26.719080107 +0000 UTC m=+143.892407537" watchObservedRunningTime="2026-04-20 07:52:26.719638272 +0000 UTC m=+143.892965709" Apr 20 07:52:26.788604 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:26.788510 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-m86n9" Apr 20 07:52:26.901090 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:26.901059 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-m86n9"] Apr 20 07:52:26.905177 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:52:26.905144 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6c7b136_0002_46ca_af48_03fceb87587b.slice/crio-8c9d53f19883d1e1b59a9c785b32df0bd6ddec202f1f8b02e0f7cbd74d6cc89d WatchSource:0}: Error finding container 8c9d53f19883d1e1b59a9c785b32df0bd6ddec202f1f8b02e0f7cbd74d6cc89d: Status 404 returned error can't find the container with id 8c9d53f19883d1e1b59a9c785b32df0bd6ddec202f1f8b02e0f7cbd74d6cc89d Apr 20 07:52:27.705767 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:27.705730 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-m86n9" event={"ID":"e6c7b136-0002-46ca-af48-03fceb87587b","Type":"ContainerStarted","Data":"daa68c3516184920a1308cf84760fa0ea055a07db2b920381b4426828b8ae91c"} Apr 20 07:52:27.705767 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:27.705766 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-service-ca/service-ca-865cb79987-m86n9" event={"ID":"e6c7b136-0002-46ca-af48-03fceb87587b","Type":"ContainerStarted","Data":"8c9d53f19883d1e1b59a9c785b32df0bd6ddec202f1f8b02e0f7cbd74d6cc89d"} Apr 20 07:52:27.722537 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:27.722483 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-m86n9" podStartSLOduration=1.722466469 podStartE2EDuration="1.722466469s" podCreationTimestamp="2026-04-20 07:52:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 07:52:27.720759894 +0000 UTC m=+144.894087322" watchObservedRunningTime="2026-04-20 07:52:27.722466469 +0000 UTC m=+144.895793900" Apr 20 07:52:39.685204 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:52:39.685165 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-7bd5554bd9-jrm2x" podUID="397df66e-7585-4bc3-aaa7-dcdc37635fc5" Apr 20 07:52:39.693298 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:52:39.693265 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[networking-console-plugin-cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-console/networking-console-plugin-cb95c66f6-smjd7" podUID="2e4b6f0f-88ff-49a7-ada1-7af9515863da" Apr 20 07:52:39.700417 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:52:39.700395 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-mqbmm" podUID="5eaa06ed-8495-4d3f-a0e2-d67bb19b8695" Apr 20 07:52:39.734169 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:39.734144 2575 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-smjd7" Apr 20 07:52:39.734312 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:39.734147 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-mqbmm" Apr 20 07:52:39.734312 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:39.734155 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7bd5554bd9-jrm2x" Apr 20 07:52:39.799089 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:52:39.799050 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-jp28h" podUID="eccac7f6-7962-45d6-9141-b02deb90631f" Apr 20 07:52:40.736863 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:40.736832 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-jp28h" Apr 20 07:52:41.301429 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:52:41.301390 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-7qpdh" podUID="46b8deca-b33b-45e1-9131-b47fde192a78" Apr 20 07:52:44.679867 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:44.679832 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/2e4b6f0f-88ff-49a7-ada1-7af9515863da-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-smjd7\" (UID: \"2e4b6f0f-88ff-49a7-ada1-7af9515863da\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-smjd7" Apr 20 07:52:44.680353 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:44.679881 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5eaa06ed-8495-4d3f-a0e2-d67bb19b8695-cert\") pod \"ingress-canary-mqbmm\" (UID: \"5eaa06ed-8495-4d3f-a0e2-d67bb19b8695\") " pod="openshift-ingress-canary/ingress-canary-mqbmm" Apr 20 07:52:44.680353 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:44.679949 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/397df66e-7585-4bc3-aaa7-dcdc37635fc5-registry-tls\") pod \"image-registry-7bd5554bd9-jrm2x\" (UID: \"397df66e-7585-4bc3-aaa7-dcdc37635fc5\") " pod="openshift-image-registry/image-registry-7bd5554bd9-jrm2x" Apr 20 07:52:44.682267 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:44.682235 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5eaa06ed-8495-4d3f-a0e2-d67bb19b8695-cert\") pod \"ingress-canary-mqbmm\" (UID: 
\"5eaa06ed-8495-4d3f-a0e2-d67bb19b8695\") " pod="openshift-ingress-canary/ingress-canary-mqbmm" Apr 20 07:52:44.682408 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:44.682385 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/397df66e-7585-4bc3-aaa7-dcdc37635fc5-registry-tls\") pod \"image-registry-7bd5554bd9-jrm2x\" (UID: \"397df66e-7585-4bc3-aaa7-dcdc37635fc5\") " pod="openshift-image-registry/image-registry-7bd5554bd9-jrm2x" Apr 20 07:52:44.682468 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:44.682419 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/2e4b6f0f-88ff-49a7-ada1-7af9515863da-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-smjd7\" (UID: \"2e4b6f0f-88ff-49a7-ada1-7af9515863da\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-smjd7" Apr 20 07:52:44.780532 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:44.780490 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/eccac7f6-7962-45d6-9141-b02deb90631f-metrics-tls\") pod \"dns-default-jp28h\" (UID: \"eccac7f6-7962-45d6-9141-b02deb90631f\") " pod="openshift-dns/dns-default-jp28h" Apr 20 07:52:44.782759 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:44.782737 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/eccac7f6-7962-45d6-9141-b02deb90631f-metrics-tls\") pod \"dns-default-jp28h\" (UID: \"eccac7f6-7962-45d6-9141-b02deb90631f\") " pod="openshift-dns/dns-default-jp28h" Apr 20 07:52:44.838549 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:44.838515 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-mxq77\"" Apr 20 07:52:44.838549 ip-10-0-129-24 
kubenswrapper[2575]: I0420 07:52:44.838533 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-68mlz\"" Apr 20 07:52:44.838740 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:44.838522 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-qlz5k\"" Apr 20 07:52:44.845056 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:44.845037 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-mqbmm" Apr 20 07:52:44.845151 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:44.845063 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7bd5554bd9-jrm2x" Apr 20 07:52:44.845151 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:44.845134 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-smjd7" Apr 20 07:52:44.940789 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:44.940656 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-dvf6t\"" Apr 20 07:52:44.948568 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:44.948396 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-jp28h" Apr 20 07:52:44.990400 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:44.989105 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-mqbmm"] Apr 20 07:52:44.999136 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:52:44.999105 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5eaa06ed_8495_4d3f_a0e2_d67bb19b8695.slice/crio-364916b2ac1706749b94cc932be933b5b8d069fb5f25fb87c4148453834a848f WatchSource:0}: Error finding container 364916b2ac1706749b94cc932be933b5b8d069fb5f25fb87c4148453834a848f: Status 404 returned error can't find the container with id 364916b2ac1706749b94cc932be933b5b8d069fb5f25fb87c4148453834a848f Apr 20 07:52:45.013306 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:45.013132 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-smjd7"] Apr 20 07:52:45.019225 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:52:45.019190 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2e4b6f0f_88ff_49a7_ada1_7af9515863da.slice/crio-33e3a5e3e66eeaace4a1fbdb458a7b013096fa181f1609f7334f31c340210586 WatchSource:0}: Error finding container 33e3a5e3e66eeaace4a1fbdb458a7b013096fa181f1609f7334f31c340210586: Status 404 returned error can't find the container with id 33e3a5e3e66eeaace4a1fbdb458a7b013096fa181f1609f7334f31c340210586 Apr 20 07:52:45.037203 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:45.037082 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-7bd5554bd9-jrm2x"] Apr 20 07:52:45.039539 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:52:45.039508 2575 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod397df66e_7585_4bc3_aaa7_dcdc37635fc5.slice/crio-acde0e5f9c6798dded049a059d3184f1bc4c2b57731b533f9c5f3cd1261c1c59 WatchSource:0}: Error finding container acde0e5f9c6798dded049a059d3184f1bc4c2b57731b533f9c5f3cd1261c1c59: Status 404 returned error can't find the container with id acde0e5f9c6798dded049a059d3184f1bc4c2b57731b533f9c5f3cd1261c1c59 Apr 20 07:52:45.077796 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:45.077772 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-jp28h"] Apr 20 07:52:45.080847 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:52:45.080786 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeccac7f6_7962_45d6_9141_b02deb90631f.slice/crio-98e3de72d274033e834ba7ecac30837a0353d3862f5b7fdc896695d204c67f55 WatchSource:0}: Error finding container 98e3de72d274033e834ba7ecac30837a0353d3862f5b7fdc896695d204c67f55: Status 404 returned error can't find the container with id 98e3de72d274033e834ba7ecac30837a0353d3862f5b7fdc896695d204c67f55 Apr 20 07:52:45.750486 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:45.750445 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-smjd7" event={"ID":"2e4b6f0f-88ff-49a7-ada1-7af9515863da","Type":"ContainerStarted","Data":"33e3a5e3e66eeaace4a1fbdb458a7b013096fa181f1609f7334f31c340210586"} Apr 20 07:52:45.751758 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:45.751722 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-jp28h" event={"ID":"eccac7f6-7962-45d6-9141-b02deb90631f","Type":"ContainerStarted","Data":"98e3de72d274033e834ba7ecac30837a0353d3862f5b7fdc896695d204c67f55"} Apr 20 07:52:45.753492 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:45.753467 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-image-registry/image-registry-7bd5554bd9-jrm2x" event={"ID":"397df66e-7585-4bc3-aaa7-dcdc37635fc5","Type":"ContainerStarted","Data":"a0b85fd28e0d12784c4e6e841d379b8de9325cd60ece03de81b3e77af83bea15"} Apr 20 07:52:45.753636 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:45.753497 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7bd5554bd9-jrm2x" event={"ID":"397df66e-7585-4bc3-aaa7-dcdc37635fc5","Type":"ContainerStarted","Data":"acde0e5f9c6798dded049a059d3184f1bc4c2b57731b533f9c5f3cd1261c1c59"} Apr 20 07:52:45.753636 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:45.753598 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-7bd5554bd9-jrm2x" Apr 20 07:52:45.754792 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:45.754768 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-mqbmm" event={"ID":"5eaa06ed-8495-4d3f-a0e2-d67bb19b8695","Type":"ContainerStarted","Data":"364916b2ac1706749b94cc932be933b5b8d069fb5f25fb87c4148453834a848f"} Apr 20 07:52:45.774134 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:45.774053 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-7bd5554bd9-jrm2x" podStartSLOduration=162.774037667 podStartE2EDuration="2m42.774037667s" podCreationTimestamp="2026-04-20 07:50:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 07:52:45.771901635 +0000 UTC m=+162.945229060" watchObservedRunningTime="2026-04-20 07:52:45.774037667 +0000 UTC m=+162.947365118" Apr 20 07:52:47.119408 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:47.119344 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-84g2g"] Apr 20 07:52:47.122674 ip-10-0-129-24 
kubenswrapper[2575]: I0420 07:52:47.122650 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-6xmbv"] Apr 20 07:52:47.122820 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:47.122792 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-84g2g" Apr 20 07:52:47.125541 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:47.125516 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-x48mk\"" Apr 20 07:52:47.125692 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:47.125565 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-n5lkq"] Apr 20 07:52:47.125756 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:47.125708 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-6xmbv" Apr 20 07:52:47.129161 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:47.129132 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-tzkbd\"" Apr 20 07:52:47.131098 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:47.129437 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\"" Apr 20 07:52:47.131962 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:47.131940 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-n5lkq" Apr 20 07:52:47.133508 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:47.133157 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-84g2g"] Apr 20 07:52:47.134387 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:47.134357 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-6xmbv"] Apr 20 07:52:47.135680 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:47.135528 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 20 07:52:47.135680 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:47.135634 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-lvs8m\"" Apr 20 07:52:47.135836 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:47.135740 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 20 07:52:47.135886 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:47.135840 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 20 07:52:47.135886 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:47.135869 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 20 07:52:47.138895 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:47.138371 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-n5lkq"] Apr 20 07:52:47.305372 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:47.305338 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/3a038fd0-e0d5-4ded-ab07-36ddf6d31d03-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-n5lkq\" (UID: \"3a038fd0-e0d5-4ded-ab07-36ddf6d31d03\") " pod="openshift-insights/insights-runtime-extractor-n5lkq" Apr 20 07:52:47.305372 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:47.305381 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rznnj\" (UniqueName: \"kubernetes.io/projected/66892cf2-e8c0-4ea8-b96d-237bfbb843f4-kube-api-access-rznnj\") pod \"network-check-source-8894fc9bd-84g2g\" (UID: \"66892cf2-e8c0-4ea8-b96d-237bfbb843f4\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-84g2g" Apr 20 07:52:47.305605 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:47.305399 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/3a038fd0-e0d5-4ded-ab07-36ddf6d31d03-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-n5lkq\" (UID: \"3a038fd0-e0d5-4ded-ab07-36ddf6d31d03\") " pod="openshift-insights/insights-runtime-extractor-n5lkq" Apr 20 07:52:47.305605 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:47.305509 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jldz5\" (UniqueName: \"kubernetes.io/projected/3a038fd0-e0d5-4ded-ab07-36ddf6d31d03-kube-api-access-jldz5\") pod \"insights-runtime-extractor-n5lkq\" (UID: \"3a038fd0-e0d5-4ded-ab07-36ddf6d31d03\") " pod="openshift-insights/insights-runtime-extractor-n5lkq" Apr 20 07:52:47.305605 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:47.305550 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/27805bcb-303c-42a4-8c37-030c42c57561-tls-certificates\") pod 
\"prometheus-operator-admission-webhook-57cf98b594-6xmbv\" (UID: \"27805bcb-303c-42a4-8c37-030c42c57561\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-6xmbv" Apr 20 07:52:47.305605 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:47.305600 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/3a038fd0-e0d5-4ded-ab07-36ddf6d31d03-crio-socket\") pod \"insights-runtime-extractor-n5lkq\" (UID: \"3a038fd0-e0d5-4ded-ab07-36ddf6d31d03\") " pod="openshift-insights/insights-runtime-extractor-n5lkq" Apr 20 07:52:47.305842 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:47.305647 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/3a038fd0-e0d5-4ded-ab07-36ddf6d31d03-data-volume\") pod \"insights-runtime-extractor-n5lkq\" (UID: \"3a038fd0-e0d5-4ded-ab07-36ddf6d31d03\") " pod="openshift-insights/insights-runtime-extractor-n5lkq" Apr 20 07:52:47.406097 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:47.406066 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jldz5\" (UniqueName: \"kubernetes.io/projected/3a038fd0-e0d5-4ded-ab07-36ddf6d31d03-kube-api-access-jldz5\") pod \"insights-runtime-extractor-n5lkq\" (UID: \"3a038fd0-e0d5-4ded-ab07-36ddf6d31d03\") " pod="openshift-insights/insights-runtime-extractor-n5lkq" Apr 20 07:52:47.406249 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:47.406116 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/27805bcb-303c-42a4-8c37-030c42c57561-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-6xmbv\" (UID: \"27805bcb-303c-42a4-8c37-030c42c57561\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-6xmbv" Apr 20 07:52:47.406249 
ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:47.406150 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/3a038fd0-e0d5-4ded-ab07-36ddf6d31d03-crio-socket\") pod \"insights-runtime-extractor-n5lkq\" (UID: \"3a038fd0-e0d5-4ded-ab07-36ddf6d31d03\") " pod="openshift-insights/insights-runtime-extractor-n5lkq" Apr 20 07:52:47.406249 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:47.406174 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/3a038fd0-e0d5-4ded-ab07-36ddf6d31d03-data-volume\") pod \"insights-runtime-extractor-n5lkq\" (UID: \"3a038fd0-e0d5-4ded-ab07-36ddf6d31d03\") " pod="openshift-insights/insights-runtime-extractor-n5lkq" Apr 20 07:52:47.406249 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:47.406205 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/3a038fd0-e0d5-4ded-ab07-36ddf6d31d03-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-n5lkq\" (UID: \"3a038fd0-e0d5-4ded-ab07-36ddf6d31d03\") " pod="openshift-insights/insights-runtime-extractor-n5lkq" Apr 20 07:52:47.406249 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:47.406236 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rznnj\" (UniqueName: \"kubernetes.io/projected/66892cf2-e8c0-4ea8-b96d-237bfbb843f4-kube-api-access-rznnj\") pod \"network-check-source-8894fc9bd-84g2g\" (UID: \"66892cf2-e8c0-4ea8-b96d-237bfbb843f4\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-84g2g" Apr 20 07:52:47.406497 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:47.406259 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: 
\"kubernetes.io/configmap/3a038fd0-e0d5-4ded-ab07-36ddf6d31d03-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-n5lkq\" (UID: \"3a038fd0-e0d5-4ded-ab07-36ddf6d31d03\") " pod="openshift-insights/insights-runtime-extractor-n5lkq" Apr 20 07:52:47.406497 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:47.406284 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/3a038fd0-e0d5-4ded-ab07-36ddf6d31d03-crio-socket\") pod \"insights-runtime-extractor-n5lkq\" (UID: \"3a038fd0-e0d5-4ded-ab07-36ddf6d31d03\") " pod="openshift-insights/insights-runtime-extractor-n5lkq" Apr 20 07:52:47.406636 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:47.406597 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/3a038fd0-e0d5-4ded-ab07-36ddf6d31d03-data-volume\") pod \"insights-runtime-extractor-n5lkq\" (UID: \"3a038fd0-e0d5-4ded-ab07-36ddf6d31d03\") " pod="openshift-insights/insights-runtime-extractor-n5lkq" Apr 20 07:52:47.406833 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:47.406815 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/3a038fd0-e0d5-4ded-ab07-36ddf6d31d03-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-n5lkq\" (UID: \"3a038fd0-e0d5-4ded-ab07-36ddf6d31d03\") " pod="openshift-insights/insights-runtime-extractor-n5lkq" Apr 20 07:52:47.408712 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:47.408671 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/27805bcb-303c-42a4-8c37-030c42c57561-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-6xmbv\" (UID: \"27805bcb-303c-42a4-8c37-030c42c57561\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-6xmbv" Apr 20 07:52:47.408886 ip-10-0-129-24 kubenswrapper[2575]: 
I0420 07:52:47.408870 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/3a038fd0-e0d5-4ded-ab07-36ddf6d31d03-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-n5lkq\" (UID: \"3a038fd0-e0d5-4ded-ab07-36ddf6d31d03\") " pod="openshift-insights/insights-runtime-extractor-n5lkq" Apr 20 07:52:47.417487 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:47.417463 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jldz5\" (UniqueName: \"kubernetes.io/projected/3a038fd0-e0d5-4ded-ab07-36ddf6d31d03-kube-api-access-jldz5\") pod \"insights-runtime-extractor-n5lkq\" (UID: \"3a038fd0-e0d5-4ded-ab07-36ddf6d31d03\") " pod="openshift-insights/insights-runtime-extractor-n5lkq" Apr 20 07:52:47.417759 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:47.417741 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rznnj\" (UniqueName: \"kubernetes.io/projected/66892cf2-e8c0-4ea8-b96d-237bfbb843f4-kube-api-access-rznnj\") pod \"network-check-source-8894fc9bd-84g2g\" (UID: \"66892cf2-e8c0-4ea8-b96d-237bfbb843f4\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-84g2g" Apr 20 07:52:47.439585 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:47.439536 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-84g2g" Apr 20 07:52:47.448270 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:47.448241 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-6xmbv" Apr 20 07:52:47.479765 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:47.479733 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-n5lkq" Apr 20 07:52:47.567225 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:47.567194 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-84g2g"] Apr 20 07:52:47.569872 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:52:47.569847 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod66892cf2_e8c0_4ea8_b96d_237bfbb843f4.slice/crio-14d0c7d23a85db0c053cc59c42d762283efce3b9033c4862f263a50833525a89 WatchSource:0}: Error finding container 14d0c7d23a85db0c053cc59c42d762283efce3b9033c4862f263a50833525a89: Status 404 returned error can't find the container with id 14d0c7d23a85db0c053cc59c42d762283efce3b9033c4862f263a50833525a89 Apr 20 07:52:47.594540 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:47.594505 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-6xmbv"] Apr 20 07:52:47.606609 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:52:47.606586 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod27805bcb_303c_42a4_8c37_030c42c57561.slice/crio-b21b401437395bd3941a0725d587cb4fc03c8f91eba05f4230c8eab1b494f9b7 WatchSource:0}: Error finding container b21b401437395bd3941a0725d587cb4fc03c8f91eba05f4230c8eab1b494f9b7: Status 404 returned error can't find the container with id b21b401437395bd3941a0725d587cb4fc03c8f91eba05f4230c8eab1b494f9b7 Apr 20 07:52:47.619870 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:47.619847 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-n5lkq"] Apr 20 07:52:47.627501 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:52:47.627477 2575 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3a038fd0_e0d5_4ded_ab07_36ddf6d31d03.slice/crio-cb9d94df06755cacb9fa3c20988362d373533a1556b353b5a3815a4ff466faec WatchSource:0}: Error finding container cb9d94df06755cacb9fa3c20988362d373533a1556b353b5a3815a4ff466faec: Status 404 returned error can't find the container with id cb9d94df06755cacb9fa3c20988362d373533a1556b353b5a3815a4ff466faec Apr 20 07:52:47.762625 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:47.762590 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-jp28h" event={"ID":"eccac7f6-7962-45d6-9141-b02deb90631f","Type":"ContainerStarted","Data":"783fbb743693e55a85175cfd36e094c781f451b55b90ffbbbbeabb3ef0138f82"} Apr 20 07:52:47.762818 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:47.762638 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-jp28h" event={"ID":"eccac7f6-7962-45d6-9141-b02deb90631f","Type":"ContainerStarted","Data":"a809a9240f591f80ced5fe798cbd0a98bce775c65546a46cddffcb679e438aaf"} Apr 20 07:52:47.762818 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:47.762768 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-jp28h" Apr 20 07:52:47.764009 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:47.763986 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-mqbmm" event={"ID":"5eaa06ed-8495-4d3f-a0e2-d67bb19b8695","Type":"ContainerStarted","Data":"a01ac2fab2aefcd2bb3c183fbe8e302175dbcfe019c72adec61bf7b5a19ae2a1"} Apr 20 07:52:47.765276 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:47.765251 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-84g2g" event={"ID":"66892cf2-e8c0-4ea8-b96d-237bfbb843f4","Type":"ContainerStarted","Data":"96972bb83434f9634b326bea35974f3f2c9e170e31c577354f1ca74b0e3908da"} Apr 20 07:52:47.765372 ip-10-0-129-24 
kubenswrapper[2575]: I0420 07:52:47.765281 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-84g2g" event={"ID":"66892cf2-e8c0-4ea8-b96d-237bfbb843f4","Type":"ContainerStarted","Data":"14d0c7d23a85db0c053cc59c42d762283efce3b9033c4862f263a50833525a89"} Apr 20 07:52:47.766630 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:47.766593 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-smjd7" event={"ID":"2e4b6f0f-88ff-49a7-ada1-7af9515863da","Type":"ContainerStarted","Data":"24ffa0edae424ea8d253a504613eff8e76cd3b3b3321b9140f81cc709d6b8892"} Apr 20 07:52:47.767872 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:47.767855 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-n5lkq" event={"ID":"3a038fd0-e0d5-4ded-ab07-36ddf6d31d03","Type":"ContainerStarted","Data":"cb6cb54c5fc055b071492a3e5ddd314c8ca5458f6ce563ab37b6c2f3ecb6627e"} Apr 20 07:52:47.767978 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:47.767877 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-n5lkq" event={"ID":"3a038fd0-e0d5-4ded-ab07-36ddf6d31d03","Type":"ContainerStarted","Data":"cb9d94df06755cacb9fa3c20988362d373533a1556b353b5a3815a4ff466faec"} Apr 20 07:52:47.768775 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:47.768759 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-6xmbv" event={"ID":"27805bcb-303c-42a4-8c37-030c42c57561","Type":"ContainerStarted","Data":"b21b401437395bd3941a0725d587cb4fc03c8f91eba05f4230c8eab1b494f9b7"} Apr 20 07:52:47.778096 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:47.778055 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-jp28h" podStartSLOduration=129.730914125 
podStartE2EDuration="2m11.778042527s" podCreationTimestamp="2026-04-20 07:50:36 +0000 UTC" firstStartedPulling="2026-04-20 07:52:45.083198066 +0000 UTC m=+162.256525478" lastFinishedPulling="2026-04-20 07:52:47.130326456 +0000 UTC m=+164.303653880" observedRunningTime="2026-04-20 07:52:47.777301108 +0000 UTC m=+164.950628538" watchObservedRunningTime="2026-04-20 07:52:47.778042527 +0000 UTC m=+164.951369950" Apr 20 07:52:47.790033 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:47.789995 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-84g2g" podStartSLOduration=0.789985414 podStartE2EDuration="789.985414ms" podCreationTimestamp="2026-04-20 07:52:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 07:52:47.789509633 +0000 UTC m=+164.962837061" watchObservedRunningTime="2026-04-20 07:52:47.789985414 +0000 UTC m=+164.963312845" Apr 20 07:52:47.805574 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:47.805520 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-smjd7" podStartSLOduration=155.637927742 podStartE2EDuration="2m37.805503875s" podCreationTimestamp="2026-04-20 07:50:10 +0000 UTC" firstStartedPulling="2026-04-20 07:52:45.021881116 +0000 UTC m=+162.195208523" lastFinishedPulling="2026-04-20 07:52:47.189457228 +0000 UTC m=+164.362784656" observedRunningTime="2026-04-20 07:52:47.805403251 +0000 UTC m=+164.978730680" watchObservedRunningTime="2026-04-20 07:52:47.805503875 +0000 UTC m=+164.978831303" Apr 20 07:52:47.824478 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:47.824426 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-mqbmm" podStartSLOduration=129.694587252 podStartE2EDuration="2m11.824414305s" 
podCreationTimestamp="2026-04-20 07:50:36 +0000 UTC" firstStartedPulling="2026-04-20 07:52:45.001282241 +0000 UTC m=+162.174609654" lastFinishedPulling="2026-04-20 07:52:47.131109285 +0000 UTC m=+164.304436707" observedRunningTime="2026-04-20 07:52:47.823500278 +0000 UTC m=+164.996827718" watchObservedRunningTime="2026-04-20 07:52:47.824414305 +0000 UTC m=+164.997741754" Apr 20 07:52:48.775098 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:48.775059 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-n5lkq" event={"ID":"3a038fd0-e0d5-4ded-ab07-36ddf6d31d03","Type":"ContainerStarted","Data":"597a366a31cb2165cb7efae3238a07347d077322b495eebb1cd36a7c89948297"} Apr 20 07:52:48.776585 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:48.776552 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-6xmbv" event={"ID":"27805bcb-303c-42a4-8c37-030c42c57561","Type":"ContainerStarted","Data":"240ada401f05e6c37e60fa77f2f66885601735aab0af8f8c119a7670e4367991"} Apr 20 07:52:48.777001 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:48.776980 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-6xmbv" Apr 20 07:52:48.778237 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:48.778214 2575 patch_prober.go:28] interesting pod/prometheus-operator-admission-webhook-57cf98b594-6xmbv container/prometheus-operator-admission-webhook namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.132.0.15:8443/healthz\": dial tcp 10.132.0.15:8443: connect: connection refused" start-of-body= Apr 20 07:52:48.778351 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:48.778257 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-6xmbv" 
podUID="27805bcb-303c-42a4-8c37-030c42c57561" containerName="prometheus-operator-admission-webhook" probeResult="failure" output="Get \"https://10.132.0.15:8443/healthz\": dial tcp 10.132.0.15:8443: connect: connection refused" Apr 20 07:52:48.791841 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:48.791749 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-6xmbv" podStartSLOduration=0.719083225 podStartE2EDuration="1.791731989s" podCreationTimestamp="2026-04-20 07:52:47 +0000 UTC" firstStartedPulling="2026-04-20 07:52:47.608476187 +0000 UTC m=+164.781803594" lastFinishedPulling="2026-04-20 07:52:48.681124949 +0000 UTC m=+165.854452358" observedRunningTime="2026-04-20 07:52:48.790370305 +0000 UTC m=+165.963697735" watchObservedRunningTime="2026-04-20 07:52:48.791731989 +0000 UTC m=+165.965059418" Apr 20 07:52:49.783727 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:49.783695 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-6xmbv" Apr 20 07:52:50.784086 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:50.784046 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-n5lkq" event={"ID":"3a038fd0-e0d5-4ded-ab07-36ddf6d31d03","Type":"ContainerStarted","Data":"0965a08bd327129a11d9e81ba01384dc55106615a02aeee7adcdcbd6f3627446"} Apr 20 07:52:50.802789 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:50.802741 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-n5lkq" podStartSLOduration=1.400557157 podStartE2EDuration="3.802725928s" podCreationTimestamp="2026-04-20 07:52:47 +0000 UTC" firstStartedPulling="2026-04-20 07:52:47.700189626 +0000 UTC m=+164.873517032" lastFinishedPulling="2026-04-20 07:52:50.102358396 +0000 UTC m=+167.275685803" 
observedRunningTime="2026-04-20 07:52:50.802073723 +0000 UTC m=+167.975401152" watchObservedRunningTime="2026-04-20 07:52:50.802725928 +0000 UTC m=+167.976053357" Apr 20 07:52:54.514647 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:54.514583 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-9987b544c-b9jrp" podUID="bfa9d59b-d0a7-43f4-aa1a-cf1b4a87ee7b" containerName="acm-agent" probeResult="failure" output="Get \"http://10.132.0.10:8000/readyz\": dial tcp 10.132.0.10:8000: connect: connection refused" Apr 20 07:52:54.794762 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:54.794679 2575 generic.go:358] "Generic (PLEG): container finished" podID="bfa9d59b-d0a7-43f4-aa1a-cf1b4a87ee7b" containerID="7d68e609aff8ab7002b70489fecb53b3cd813001462cc1d9c9469ed78477719e" exitCode=1 Apr 20 07:52:54.794924 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:54.794762 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-9987b544c-b9jrp" event={"ID":"bfa9d59b-d0a7-43f4-aa1a-cf1b4a87ee7b","Type":"ContainerDied","Data":"7d68e609aff8ab7002b70489fecb53b3cd813001462cc1d9c9469ed78477719e"} Apr 20 07:52:54.795147 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:54.795132 2575 scope.go:117] "RemoveContainer" containerID="7d68e609aff8ab7002b70489fecb53b3cd813001462cc1d9c9469ed78477719e" Apr 20 07:52:55.110344 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:55.110250 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-mt56d"] Apr 20 07:52:55.113531 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:55.113505 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-mt56d" Apr 20 07:52:55.116602 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:55.116414 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 20 07:52:55.116602 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:55.116435 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 20 07:52:55.116602 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:55.116454 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 20 07:52:55.116602 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:55.116420 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-59mhk\"" Apr 20 07:52:55.117846 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:55.117822 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 20 07:52:55.117950 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:55.117825 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 20 07:52:55.117950 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:55.117875 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 20 07:52:55.267824 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:55.267788 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/1c403e29-027b-4589-8597-f3313cb7a43d-node-exporter-accelerators-collector-config\") pod \"node-exporter-mt56d\" (UID: 
\"1c403e29-027b-4589-8597-f3313cb7a43d\") " pod="openshift-monitoring/node-exporter-mt56d" Apr 20 07:52:55.267824 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:55.267825 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/1c403e29-027b-4589-8597-f3313cb7a43d-node-exporter-tls\") pod \"node-exporter-mt56d\" (UID: \"1c403e29-027b-4589-8597-f3313cb7a43d\") " pod="openshift-monitoring/node-exporter-mt56d" Apr 20 07:52:55.268045 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:55.267844 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/1c403e29-027b-4589-8597-f3313cb7a43d-node-exporter-wtmp\") pod \"node-exporter-mt56d\" (UID: \"1c403e29-027b-4589-8597-f3313cb7a43d\") " pod="openshift-monitoring/node-exporter-mt56d" Apr 20 07:52:55.268045 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:55.267928 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zgb8s\" (UniqueName: \"kubernetes.io/projected/1c403e29-027b-4589-8597-f3313cb7a43d-kube-api-access-zgb8s\") pod \"node-exporter-mt56d\" (UID: \"1c403e29-027b-4589-8597-f3313cb7a43d\") " pod="openshift-monitoring/node-exporter-mt56d" Apr 20 07:52:55.268045 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:55.267979 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1c403e29-027b-4589-8597-f3313cb7a43d-sys\") pod \"node-exporter-mt56d\" (UID: \"1c403e29-027b-4589-8597-f3313cb7a43d\") " pod="openshift-monitoring/node-exporter-mt56d" Apr 20 07:52:55.268145 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:55.268046 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" 
(UniqueName: \"kubernetes.io/secret/1c403e29-027b-4589-8597-f3313cb7a43d-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-mt56d\" (UID: \"1c403e29-027b-4589-8597-f3313cb7a43d\") " pod="openshift-monitoring/node-exporter-mt56d" Apr 20 07:52:55.268145 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:55.268075 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1c403e29-027b-4589-8597-f3313cb7a43d-metrics-client-ca\") pod \"node-exporter-mt56d\" (UID: \"1c403e29-027b-4589-8597-f3313cb7a43d\") " pod="openshift-monitoring/node-exporter-mt56d" Apr 20 07:52:55.268145 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:55.268104 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/1c403e29-027b-4589-8597-f3313cb7a43d-node-exporter-textfile\") pod \"node-exporter-mt56d\" (UID: \"1c403e29-027b-4589-8597-f3313cb7a43d\") " pod="openshift-monitoring/node-exporter-mt56d" Apr 20 07:52:55.268145 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:55.268127 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/1c403e29-027b-4589-8597-f3313cb7a43d-root\") pod \"node-exporter-mt56d\" (UID: \"1c403e29-027b-4589-8597-f3313cb7a43d\") " pod="openshift-monitoring/node-exporter-mt56d" Apr 20 07:52:55.285982 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:55.285955 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-7qpdh" Apr 20 07:52:55.368554 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:55.368527 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1c403e29-027b-4589-8597-f3313cb7a43d-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-mt56d\" (UID: \"1c403e29-027b-4589-8597-f3313cb7a43d\") " pod="openshift-monitoring/node-exporter-mt56d" Apr 20 07:52:55.368751 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:55.368567 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1c403e29-027b-4589-8597-f3313cb7a43d-metrics-client-ca\") pod \"node-exporter-mt56d\" (UID: \"1c403e29-027b-4589-8597-f3313cb7a43d\") " pod="openshift-monitoring/node-exporter-mt56d" Apr 20 07:52:55.368751 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:55.368588 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/1c403e29-027b-4589-8597-f3313cb7a43d-node-exporter-textfile\") pod \"node-exporter-mt56d\" (UID: \"1c403e29-027b-4589-8597-f3313cb7a43d\") " pod="openshift-monitoring/node-exporter-mt56d" Apr 20 07:52:55.368751 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:55.368608 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/1c403e29-027b-4589-8597-f3313cb7a43d-root\") pod \"node-exporter-mt56d\" (UID: \"1c403e29-027b-4589-8597-f3313cb7a43d\") " pod="openshift-monitoring/node-exporter-mt56d" Apr 20 07:52:55.368751 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:55.368689 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/1c403e29-027b-4589-8597-f3313cb7a43d-root\") pod \"node-exporter-mt56d\" 
(UID: \"1c403e29-027b-4589-8597-f3313cb7a43d\") " pod="openshift-monitoring/node-exporter-mt56d" Apr 20 07:52:55.368751 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:55.368724 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/1c403e29-027b-4589-8597-f3313cb7a43d-node-exporter-accelerators-collector-config\") pod \"node-exporter-mt56d\" (UID: \"1c403e29-027b-4589-8597-f3313cb7a43d\") " pod="openshift-monitoring/node-exporter-mt56d" Apr 20 07:52:55.368751 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:55.368742 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/1c403e29-027b-4589-8597-f3313cb7a43d-node-exporter-tls\") pod \"node-exporter-mt56d\" (UID: \"1c403e29-027b-4589-8597-f3313cb7a43d\") " pod="openshift-monitoring/node-exporter-mt56d" Apr 20 07:52:55.369036 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:55.368759 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/1c403e29-027b-4589-8597-f3313cb7a43d-node-exporter-wtmp\") pod \"node-exporter-mt56d\" (UID: \"1c403e29-027b-4589-8597-f3313cb7a43d\") " pod="openshift-monitoring/node-exporter-mt56d" Apr 20 07:52:55.369036 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:55.368784 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zgb8s\" (UniqueName: \"kubernetes.io/projected/1c403e29-027b-4589-8597-f3313cb7a43d-kube-api-access-zgb8s\") pod \"node-exporter-mt56d\" (UID: \"1c403e29-027b-4589-8597-f3313cb7a43d\") " pod="openshift-monitoring/node-exporter-mt56d" Apr 20 07:52:55.369036 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:55.368830 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/1c403e29-027b-4589-8597-f3313cb7a43d-sys\") pod \"node-exporter-mt56d\" (UID: \"1c403e29-027b-4589-8597-f3313cb7a43d\") " pod="openshift-monitoring/node-exporter-mt56d" Apr 20 07:52:55.369036 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:55.368898 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1c403e29-027b-4589-8597-f3313cb7a43d-sys\") pod \"node-exporter-mt56d\" (UID: \"1c403e29-027b-4589-8597-f3313cb7a43d\") " pod="openshift-monitoring/node-exporter-mt56d" Apr 20 07:52:55.369036 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:55.368975 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/1c403e29-027b-4589-8597-f3313cb7a43d-node-exporter-textfile\") pod \"node-exporter-mt56d\" (UID: \"1c403e29-027b-4589-8597-f3313cb7a43d\") " pod="openshift-monitoring/node-exporter-mt56d" Apr 20 07:52:55.369231 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:55.369075 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/1c403e29-027b-4589-8597-f3313cb7a43d-node-exporter-wtmp\") pod \"node-exporter-mt56d\" (UID: \"1c403e29-027b-4589-8597-f3313cb7a43d\") " pod="openshift-monitoring/node-exporter-mt56d" Apr 20 07:52:55.369294 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:55.369275 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1c403e29-027b-4589-8597-f3313cb7a43d-metrics-client-ca\") pod \"node-exporter-mt56d\" (UID: \"1c403e29-027b-4589-8597-f3313cb7a43d\") " pod="openshift-monitoring/node-exporter-mt56d" Apr 20 07:52:55.369352 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:55.369337 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: 
\"kubernetes.io/configmap/1c403e29-027b-4589-8597-f3313cb7a43d-node-exporter-accelerators-collector-config\") pod \"node-exporter-mt56d\" (UID: \"1c403e29-027b-4589-8597-f3313cb7a43d\") " pod="openshift-monitoring/node-exporter-mt56d" Apr 20 07:52:55.370942 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:55.370920 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1c403e29-027b-4589-8597-f3313cb7a43d-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-mt56d\" (UID: \"1c403e29-027b-4589-8597-f3313cb7a43d\") " pod="openshift-monitoring/node-exporter-mt56d" Apr 20 07:52:55.371172 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:55.371153 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/1c403e29-027b-4589-8597-f3313cb7a43d-node-exporter-tls\") pod \"node-exporter-mt56d\" (UID: \"1c403e29-027b-4589-8597-f3313cb7a43d\") " pod="openshift-monitoring/node-exporter-mt56d" Apr 20 07:52:55.376200 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:55.376181 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zgb8s\" (UniqueName: \"kubernetes.io/projected/1c403e29-027b-4589-8597-f3313cb7a43d-kube-api-access-zgb8s\") pod \"node-exporter-mt56d\" (UID: \"1c403e29-027b-4589-8597-f3313cb7a43d\") " pod="openshift-monitoring/node-exporter-mt56d" Apr 20 07:52:55.424179 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:55.424152 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-mt56d" Apr 20 07:52:55.433516 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:52:55.433488 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1c403e29_027b_4589_8597_f3313cb7a43d.slice/crio-6f989ed0185e416064c9099100e6585f2f80823a6f815623f8aa1c6a2ca41528 WatchSource:0}: Error finding container 6f989ed0185e416064c9099100e6585f2f80823a6f815623f8aa1c6a2ca41528: Status 404 returned error can't find the container with id 6f989ed0185e416064c9099100e6585f2f80823a6f815623f8aa1c6a2ca41528 Apr 20 07:52:55.799577 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:55.799491 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-mt56d" event={"ID":"1c403e29-027b-4589-8597-f3313cb7a43d","Type":"ContainerStarted","Data":"6f989ed0185e416064c9099100e6585f2f80823a6f815623f8aa1c6a2ca41528"} Apr 20 07:52:55.801331 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:55.801303 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-9987b544c-b9jrp" event={"ID":"bfa9d59b-d0a7-43f4-aa1a-cf1b4a87ee7b","Type":"ContainerStarted","Data":"f1b002d67f02a1a97248b1abda0145c2672a80731a5766debccd533440cc86dd"} Apr 20 07:52:55.801629 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:55.801574 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-9987b544c-b9jrp" Apr 20 07:52:55.802258 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:55.802236 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-9987b544c-b9jrp" Apr 20 07:52:56.805730 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:56.805697 2575 generic.go:358] "Generic (PLEG): container finished" podID="1c403e29-027b-4589-8597-f3313cb7a43d" 
containerID="7d25fff230634d7dd6d82361115ba31977a9bf475e04f9cf8375f028db4dc63d" exitCode=0 Apr 20 07:52:56.806083 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:56.805789 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-mt56d" event={"ID":"1c403e29-027b-4589-8597-f3313cb7a43d","Type":"ContainerDied","Data":"7d25fff230634d7dd6d82361115ba31977a9bf475e04f9cf8375f028db4dc63d"} Apr 20 07:52:57.779877 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:57.779847 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-jp28h" Apr 20 07:52:57.810820 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:57.810789 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-mt56d" event={"ID":"1c403e29-027b-4589-8597-f3313cb7a43d","Type":"ContainerStarted","Data":"c5958c0a4956d6dbf8a7942ef9fce9cbd18e519e7c4f761e98c948507d727afc"} Apr 20 07:52:57.810820 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:57.810826 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-mt56d" event={"ID":"1c403e29-027b-4589-8597-f3313cb7a43d","Type":"ContainerStarted","Data":"24237e5c83ab76095b4d9f12c4cce88994e8be1f516d41fb23d0e03257751f59"} Apr 20 07:52:57.829342 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:57.829289 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-mt56d" podStartSLOduration=2.123968305 podStartE2EDuration="2.829272016s" podCreationTimestamp="2026-04-20 07:52:55 +0000 UTC" firstStartedPulling="2026-04-20 07:52:55.435783585 +0000 UTC m=+172.609110992" lastFinishedPulling="2026-04-20 07:52:56.141087274 +0000 UTC m=+173.314414703" observedRunningTime="2026-04-20 07:52:57.828660985 +0000 UTC m=+175.001988414" watchObservedRunningTime="2026-04-20 07:52:57.829272016 +0000 UTC m=+175.002599446" Apr 20 07:52:58.176532 ip-10-0-129-24 kubenswrapper[2575]: I0420 
07:52:58.176497 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-67dfbc55cc-xxt72"] Apr 20 07:52:58.180046 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:58.180024 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-67dfbc55cc-xxt72" Apr 20 07:52:58.182813 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:58.182787 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-ers8386bajtpc\"" Apr 20 07:52:58.182813 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:58.182794 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\"" Apr 20 07:52:58.182995 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:58.182840 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\"" Apr 20 07:52:58.182995 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:58.182867 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\"" Apr 20 07:52:58.182995 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:58.182795 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\"" Apr 20 07:52:58.183255 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:58.183232 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-mhlbc\"" Apr 20 07:52:58.183370 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:58.183317 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\"" Apr 20 07:52:58.188578 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:58.188554 2575 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/b749342e-e13a-4969-8047-1f0d69bbbcef-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-67dfbc55cc-xxt72\" (UID: \"b749342e-e13a-4969-8047-1f0d69bbbcef\") " pod="openshift-monitoring/thanos-querier-67dfbc55cc-xxt72" Apr 20 07:52:58.188675 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:58.188603 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/b749342e-e13a-4969-8047-1f0d69bbbcef-secret-thanos-querier-tls\") pod \"thanos-querier-67dfbc55cc-xxt72\" (UID: \"b749342e-e13a-4969-8047-1f0d69bbbcef\") " pod="openshift-monitoring/thanos-querier-67dfbc55cc-xxt72" Apr 20 07:52:58.188675 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:58.188662 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6j8k6\" (UniqueName: \"kubernetes.io/projected/b749342e-e13a-4969-8047-1f0d69bbbcef-kube-api-access-6j8k6\") pod \"thanos-querier-67dfbc55cc-xxt72\" (UID: \"b749342e-e13a-4969-8047-1f0d69bbbcef\") " pod="openshift-monitoring/thanos-querier-67dfbc55cc-xxt72" Apr 20 07:52:58.188764 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:58.188696 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b749342e-e13a-4969-8047-1f0d69bbbcef-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-67dfbc55cc-xxt72\" (UID: \"b749342e-e13a-4969-8047-1f0d69bbbcef\") " pod="openshift-monitoring/thanos-querier-67dfbc55cc-xxt72" Apr 20 07:52:58.188764 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:58.188750 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b749342e-e13a-4969-8047-1f0d69bbbcef-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-67dfbc55cc-xxt72\" (UID: \"b749342e-e13a-4969-8047-1f0d69bbbcef\") " pod="openshift-monitoring/thanos-querier-67dfbc55cc-xxt72" Apr 20 07:52:58.188835 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:58.188797 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b749342e-e13a-4969-8047-1f0d69bbbcef-metrics-client-ca\") pod \"thanos-querier-67dfbc55cc-xxt72\" (UID: \"b749342e-e13a-4969-8047-1f0d69bbbcef\") " pod="openshift-monitoring/thanos-querier-67dfbc55cc-xxt72" Apr 20 07:52:58.188835 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:58.188828 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/b749342e-e13a-4969-8047-1f0d69bbbcef-secret-grpc-tls\") pod \"thanos-querier-67dfbc55cc-xxt72\" (UID: \"b749342e-e13a-4969-8047-1f0d69bbbcef\") " pod="openshift-monitoring/thanos-querier-67dfbc55cc-xxt72" Apr 20 07:52:58.188912 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:58.188865 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/b749342e-e13a-4969-8047-1f0d69bbbcef-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-67dfbc55cc-xxt72\" (UID: \"b749342e-e13a-4969-8047-1f0d69bbbcef\") " pod="openshift-monitoring/thanos-querier-67dfbc55cc-xxt72" Apr 20 07:52:58.192089 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:58.192069 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-67dfbc55cc-xxt72"] Apr 20 07:52:58.289314 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:58.289276 2575 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/b749342e-e13a-4969-8047-1f0d69bbbcef-secret-grpc-tls\") pod \"thanos-querier-67dfbc55cc-xxt72\" (UID: \"b749342e-e13a-4969-8047-1f0d69bbbcef\") " pod="openshift-monitoring/thanos-querier-67dfbc55cc-xxt72" Apr 20 07:52:58.289512 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:58.289325 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/b749342e-e13a-4969-8047-1f0d69bbbcef-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-67dfbc55cc-xxt72\" (UID: \"b749342e-e13a-4969-8047-1f0d69bbbcef\") " pod="openshift-monitoring/thanos-querier-67dfbc55cc-xxt72" Apr 20 07:52:58.289512 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:58.289387 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/b749342e-e13a-4969-8047-1f0d69bbbcef-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-67dfbc55cc-xxt72\" (UID: \"b749342e-e13a-4969-8047-1f0d69bbbcef\") " pod="openshift-monitoring/thanos-querier-67dfbc55cc-xxt72" Apr 20 07:52:58.289512 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:58.289419 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/b749342e-e13a-4969-8047-1f0d69bbbcef-secret-thanos-querier-tls\") pod \"thanos-querier-67dfbc55cc-xxt72\" (UID: \"b749342e-e13a-4969-8047-1f0d69bbbcef\") " pod="openshift-monitoring/thanos-querier-67dfbc55cc-xxt72" Apr 20 07:52:58.289512 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:58.289450 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6j8k6\" (UniqueName: 
\"kubernetes.io/projected/b749342e-e13a-4969-8047-1f0d69bbbcef-kube-api-access-6j8k6\") pod \"thanos-querier-67dfbc55cc-xxt72\" (UID: \"b749342e-e13a-4969-8047-1f0d69bbbcef\") " pod="openshift-monitoring/thanos-querier-67dfbc55cc-xxt72" Apr 20 07:52:58.289819 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:58.289600 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b749342e-e13a-4969-8047-1f0d69bbbcef-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-67dfbc55cc-xxt72\" (UID: \"b749342e-e13a-4969-8047-1f0d69bbbcef\") " pod="openshift-monitoring/thanos-querier-67dfbc55cc-xxt72" Apr 20 07:52:58.289819 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:58.289670 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b749342e-e13a-4969-8047-1f0d69bbbcef-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-67dfbc55cc-xxt72\" (UID: \"b749342e-e13a-4969-8047-1f0d69bbbcef\") " pod="openshift-monitoring/thanos-querier-67dfbc55cc-xxt72" Apr 20 07:52:58.289819 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:58.289742 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b749342e-e13a-4969-8047-1f0d69bbbcef-metrics-client-ca\") pod \"thanos-querier-67dfbc55cc-xxt72\" (UID: \"b749342e-e13a-4969-8047-1f0d69bbbcef\") " pod="openshift-monitoring/thanos-querier-67dfbc55cc-xxt72" Apr 20 07:52:58.291062 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:58.291031 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b749342e-e13a-4969-8047-1f0d69bbbcef-metrics-client-ca\") pod \"thanos-querier-67dfbc55cc-xxt72\" (UID: \"b749342e-e13a-4969-8047-1f0d69bbbcef\") " 
pod="openshift-monitoring/thanos-querier-67dfbc55cc-xxt72" Apr 20 07:52:58.292269 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:58.292238 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/b749342e-e13a-4969-8047-1f0d69bbbcef-secret-grpc-tls\") pod \"thanos-querier-67dfbc55cc-xxt72\" (UID: \"b749342e-e13a-4969-8047-1f0d69bbbcef\") " pod="openshift-monitoring/thanos-querier-67dfbc55cc-xxt72" Apr 20 07:52:58.292459 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:58.292391 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/b749342e-e13a-4969-8047-1f0d69bbbcef-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-67dfbc55cc-xxt72\" (UID: \"b749342e-e13a-4969-8047-1f0d69bbbcef\") " pod="openshift-monitoring/thanos-querier-67dfbc55cc-xxt72" Apr 20 07:52:58.292459 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:58.292406 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b749342e-e13a-4969-8047-1f0d69bbbcef-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-67dfbc55cc-xxt72\" (UID: \"b749342e-e13a-4969-8047-1f0d69bbbcef\") " pod="openshift-monitoring/thanos-querier-67dfbc55cc-xxt72" Apr 20 07:52:58.292652 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:58.292629 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/b749342e-e13a-4969-8047-1f0d69bbbcef-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-67dfbc55cc-xxt72\" (UID: \"b749342e-e13a-4969-8047-1f0d69bbbcef\") " pod="openshift-monitoring/thanos-querier-67dfbc55cc-xxt72" Apr 20 07:52:58.292906 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:58.292868 2575 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b749342e-e13a-4969-8047-1f0d69bbbcef-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-67dfbc55cc-xxt72\" (UID: \"b749342e-e13a-4969-8047-1f0d69bbbcef\") " pod="openshift-monitoring/thanos-querier-67dfbc55cc-xxt72" Apr 20 07:52:58.293003 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:58.292984 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/b749342e-e13a-4969-8047-1f0d69bbbcef-secret-thanos-querier-tls\") pod \"thanos-querier-67dfbc55cc-xxt72\" (UID: \"b749342e-e13a-4969-8047-1f0d69bbbcef\") " pod="openshift-monitoring/thanos-querier-67dfbc55cc-xxt72" Apr 20 07:52:58.297235 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:58.297213 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6j8k6\" (UniqueName: \"kubernetes.io/projected/b749342e-e13a-4969-8047-1f0d69bbbcef-kube-api-access-6j8k6\") pod \"thanos-querier-67dfbc55cc-xxt72\" (UID: \"b749342e-e13a-4969-8047-1f0d69bbbcef\") " pod="openshift-monitoring/thanos-querier-67dfbc55cc-xxt72" Apr 20 07:52:58.490241 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:58.490162 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-67dfbc55cc-xxt72"
Apr 20 07:52:58.610779 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:58.610743 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-67dfbc55cc-xxt72"]
Apr 20 07:52:58.614756 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:52:58.614732 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb749342e_e13a_4969_8047_1f0d69bbbcef.slice/crio-c0cea4c142b3b1c339f10cce713e3fc4c45c6d64fba0dd2ffa1b159a558f0fbe WatchSource:0}: Error finding container c0cea4c142b3b1c339f10cce713e3fc4c45c6d64fba0dd2ffa1b159a558f0fbe: Status 404 returned error can't find the container with id c0cea4c142b3b1c339f10cce713e3fc4c45c6d64fba0dd2ffa1b159a558f0fbe
Apr 20 07:52:58.819301 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:52:58.819214 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-67dfbc55cc-xxt72" event={"ID":"b749342e-e13a-4969-8047-1f0d69bbbcef","Type":"ContainerStarted","Data":"c0cea4c142b3b1c339f10cce713e3fc4c45c6d64fba0dd2ffa1b159a558f0fbe"}
Apr 20 07:53:00.826415 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:53:00.826332 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-67dfbc55cc-xxt72" event={"ID":"b749342e-e13a-4969-8047-1f0d69bbbcef","Type":"ContainerStarted","Data":"b4b4a685bee4d0edba23295dcd47fdfc8318b1325f05a1ffc3b2cc794e933de2"}
Apr 20 07:53:00.826415 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:53:00.826367 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-67dfbc55cc-xxt72" event={"ID":"b749342e-e13a-4969-8047-1f0d69bbbcef","Type":"ContainerStarted","Data":"02255a50b7fa4925da22dba170179fd6a197595c73713ce15487fca841168c9d"}
Apr 20 07:53:00.826415 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:53:00.826377 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-67dfbc55cc-xxt72" event={"ID":"b749342e-e13a-4969-8047-1f0d69bbbcef","Type":"ContainerStarted","Data":"8d70d7c3bc5f1bc4046f4e3c474abd5ec8eab295b3e82cebfeb98f37df07743f"}
Apr 20 07:53:01.310604 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:53:01.310567 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 20 07:53:01.314777 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:53:01.314756 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 07:53:01.317459 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:53:01.317435 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-20pri24etom4j\""
Apr 20 07:53:01.317652 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:53:01.317630 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\""
Apr 20 07:53:01.317731 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:53:01.317717 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\""
Apr 20 07:53:01.317791 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:53:01.317730 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\""
Apr 20 07:53:01.317867 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:53:01.317851 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\""
Apr 20 07:53:01.318032 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:53:01.317995 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\""
Apr 20 07:53:01.318521 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:53:01.318501 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\""
Apr 20 07:53:01.318781 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:53:01.318749 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\""
Apr 20 07:53:01.318781 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:53:01.318753 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\""
Apr 20 07:53:01.318973 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:53:01.318881 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-jvrkd\""
Apr 20 07:53:01.319051 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:53:01.319011 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\""
Apr 20 07:53:01.319385 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:53:01.319269 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\""
Apr 20 07:53:01.319385 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:53:01.319278 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\""
Apr 20 07:53:01.319554 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:53:01.319530 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\""
Apr 20 07:53:01.321209 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:53:01.321189 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\""
Apr 20 07:53:01.325793 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:53:01.325755 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 20 07:53:01.415294 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:53:01.415264 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 07:53:01.415405 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:53:01.415304 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 07:53:01.415405 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:53:01.415333 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 07:53:01.415405 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:53:01.415382 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 07:53:01.415513 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:53:01.415430 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39-config\") pod \"prometheus-k8s-0\" (UID: \"3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 07:53:01.415513 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:53:01.415466 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 07:53:01.415513 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:53:01.415487 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 07:53:01.415513 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:53:01.415511 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 07:53:01.415670 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:53:01.415532 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hh9sq\" (UniqueName: \"kubernetes.io/projected/3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39-kube-api-access-hh9sq\") pod \"prometheus-k8s-0\" (UID: \"3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 07:53:01.415670 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:53:01.415585 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 07:53:01.415670 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:53:01.415605 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 07:53:01.415670 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:53:01.415640 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39-web-config\") pod \"prometheus-k8s-0\" (UID: \"3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 07:53:01.415670 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:53:01.415660 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 07:53:01.415842 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:53:01.415677 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 07:53:01.415842 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:53:01.415692 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 07:53:01.415842 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:53:01.415713 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39-config-out\") pod \"prometheus-k8s-0\" (UID: \"3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 07:53:01.415842 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:53:01.415775 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 07:53:01.415951 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:53:01.415845 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 07:53:01.517094 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:53:01.517067 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 07:53:01.517185 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:53:01.517116 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39-config\") pod \"prometheus-k8s-0\" (UID: \"3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 07:53:01.517185 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:53:01.517155 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 07:53:01.517263 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:53:01.517187 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 07:53:01.517263 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:53:01.517220 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 07:53:01.517263 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:53:01.517243 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hh9sq\" (UniqueName: \"kubernetes.io/projected/3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39-kube-api-access-hh9sq\") pod \"prometheus-k8s-0\" (UID: \"3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 07:53:01.517361 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:53:01.517279 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 07:53:01.517361 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:53:01.517306 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 07:53:01.517361 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:53:01.517329 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39-web-config\") pod \"prometheus-k8s-0\" (UID: \"3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 07:53:01.517454 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:53:01.517356 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 07:53:01.517454 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:53:01.517386 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 07:53:01.517454 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:53:01.517412 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 07:53:01.517454 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:53:01.517441 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39-config-out\") pod \"prometheus-k8s-0\" (UID: \"3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 07:53:01.517583 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:53:01.517475 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 07:53:01.517583 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:53:01.517531 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 07:53:01.517583 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:53:01.517559 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 07:53:01.517693 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:53:01.517589 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 07:53:01.517693 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:53:01.517635 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 07:53:01.518501 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:53:01.518459 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 07:53:01.519113 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:53:01.519074 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 07:53:01.521155 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:53:01.520798 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 07:53:01.521155 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:53:01.520985 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 07:53:01.521155 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:53:01.521055 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39-config\") pod \"prometheus-k8s-0\" (UID: \"3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 07:53:01.521155 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:53:01.521073 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 07:53:01.521404 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:53:01.521340 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 07:53:01.521404 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:53:01.521388 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 07:53:01.521782 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:53:01.521757 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 07:53:01.521934 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:53:01.521910 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 07:53:01.523880 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:53:01.523841 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 07:53:01.524174 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:53:01.524140 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 07:53:01.524272 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:53:01.524199 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39-config-out\") pod \"prometheus-k8s-0\" (UID: \"3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 07:53:01.524272 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:53:01.524208 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39-web-config\") pod \"prometheus-k8s-0\" (UID: \"3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 07:53:01.524378 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:53:01.524365 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 07:53:01.524561 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:53:01.524542 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 07:53:01.524806 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:53:01.524781 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 07:53:01.527054 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:53:01.527034 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hh9sq\" (UniqueName: \"kubernetes.io/projected/3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39-kube-api-access-hh9sq\") pod \"prometheus-k8s-0\" (UID: \"3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 07:53:01.625809 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:53:01.625760 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 07:53:01.757883 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:53:01.757709 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 20 07:53:01.760542 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:53:01.760520 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3ed85a0f_27c2_45c5_8dcd_c50bce2f5d39.slice/crio-75ae7783822f6ce0e115321a63afd50e128a33ef5098d585c95176b8c5850b19 WatchSource:0}: Error finding container 75ae7783822f6ce0e115321a63afd50e128a33ef5098d585c95176b8c5850b19: Status 404 returned error can't find the container with id 75ae7783822f6ce0e115321a63afd50e128a33ef5098d585c95176b8c5850b19
Apr 20 07:53:01.831775 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:53:01.831668 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-67dfbc55cc-xxt72" event={"ID":"b749342e-e13a-4969-8047-1f0d69bbbcef","Type":"ContainerStarted","Data":"5ff8b942723de4bb878f1ef5041210dae7129d918c3e848730ffff4d65ec3d18"}
Apr 20 07:53:01.831775 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:53:01.831719 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-67dfbc55cc-xxt72" event={"ID":"b749342e-e13a-4969-8047-1f0d69bbbcef","Type":"ContainerStarted","Data":"9066e67160e20db9a1cc5efc1fc7aed838feef8133fe685def44b87183910b0c"}
Apr 20 07:53:01.831775 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:53:01.831731 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-67dfbc55cc-xxt72" event={"ID":"b749342e-e13a-4969-8047-1f0d69bbbcef","Type":"ContainerStarted","Data":"88654f64d3c6d35fb47c1a0e41604cd9aba91045a2feccf6149359a982dc9a5c"}
Apr 20 07:53:01.832304 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:53:01.831864 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-67dfbc55cc-xxt72"
Apr 20 07:53:01.832878 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:53:01.832848 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39","Type":"ContainerStarted","Data":"75ae7783822f6ce0e115321a63afd50e128a33ef5098d585c95176b8c5850b19"}
Apr 20 07:53:01.857110 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:53:01.857058 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-67dfbc55cc-xxt72" podStartSLOduration=1.0725282 podStartE2EDuration="3.857042196s" podCreationTimestamp="2026-04-20 07:52:58 +0000 UTC" firstStartedPulling="2026-04-20 07:52:58.616496008 +0000 UTC m=+175.789823415" lastFinishedPulling="2026-04-20 07:53:01.401010004 +0000 UTC m=+178.574337411" observedRunningTime="2026-04-20 07:53:01.855705984 +0000 UTC m=+179.029033413" watchObservedRunningTime="2026-04-20 07:53:01.857042196 +0000 UTC m=+179.030369625"
Apr 20 07:53:02.837564 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:53:02.837530 2575 generic.go:358] "Generic (PLEG): container finished" podID="3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39" containerID="3be3f96e59beed94abfe2dc15da3a224f92eaef36a412a02c4832a447948ef49" exitCode=0
Apr 20 07:53:02.837978 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:53:02.837632 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39","Type":"ContainerDied","Data":"3be3f96e59beed94abfe2dc15da3a224f92eaef36a412a02c4832a447948ef49"}
Apr 20 07:53:04.850034 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:53:04.849988 2575 patch_prober.go:28] interesting pod/image-registry-7bd5554bd9-jrm2x container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 20 07:53:04.850436 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:53:04.850049 2575 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-7bd5554bd9-jrm2x" podUID="397df66e-7585-4bc3-aaa7-dcdc37635fc5" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 20 07:53:06.762471 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:53:06.762444 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-7bd5554bd9-jrm2x"
Apr 20 07:53:06.852099 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:53:06.852069 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39","Type":"ContainerStarted","Data":"17b131269b33c53c7c59ae5b4788740822d947d4df21565157e666856607ce79"}
Apr 20 07:53:06.852099 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:53:06.852101 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39","Type":"ContainerStarted","Data":"932f56abf3b4eb35db8c273278f0a3577b9139b698645cf5ec1c13178151a7e0"}
Apr 20 07:53:06.852277 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:53:06.852111 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39","Type":"ContainerStarted","Data":"61f6eecd6bae27dbd2dacfd6bef766491d5d694257c114e4aa9413f410ec9a86"}
Apr 20 07:53:06.852277 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:53:06.852121 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39","Type":"ContainerStarted","Data":"661ffabef5963a2d97ca20bf50c2513abd107dc3dd564607222d4a9516a0b06a"}
Apr 20 07:53:06.852277 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:53:06.852130 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39","Type":"ContainerStarted","Data":"ecce3c069fce4a7a9117c7fc8e19ce2c76ad94f106d04ab1bd2790ea906e3b0a"}
Apr 20 07:53:06.852277 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:53:06.852139 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39","Type":"ContainerStarted","Data":"03e96809214c8727175e151a8d7165ec5e15ce92a10f089ca44b895f52a891eb"}
Apr 20 07:53:06.878632 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:53:06.878575 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=1.833372113 podStartE2EDuration="5.878560747s" podCreationTimestamp="2026-04-20 07:53:01 +0000 UTC" firstStartedPulling="2026-04-20 07:53:01.762663739 +0000 UTC m=+178.935991146" lastFinishedPulling="2026-04-20 07:53:05.807852373 +0000 UTC m=+182.981179780" observedRunningTime="2026-04-20 07:53:06.876706672 +0000 UTC m=+184.050034102" watchObservedRunningTime="2026-04-20 07:53:06.878560747 +0000 UTC m=+184.051888175"
Apr 20 07:53:07.844876 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:53:07.844834 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-67dfbc55cc-xxt72"
Apr 20 07:53:09.206774 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:53:09.206735 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-7bd5554bd9-jrm2x"]
Apr 20 07:53:11.626496 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:53:11.626456 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 07:53:34.225694 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:53:34.225639 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-7bd5554bd9-jrm2x" podUID="397df66e-7585-4bc3-aaa7-dcdc37635fc5" containerName="registry" containerID="cri-o://a0b85fd28e0d12784c4e6e841d379b8de9325cd60ece03de81b3e77af83bea15" gracePeriod=30
Apr 20 07:53:34.471633 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:53:34.471592 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7bd5554bd9-jrm2x"
Apr 20 07:53:34.609666 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:53:34.609547 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/397df66e-7585-4bc3-aaa7-dcdc37635fc5-registry-tls\") pod \"397df66e-7585-4bc3-aaa7-dcdc37635fc5\" (UID: \"397df66e-7585-4bc3-aaa7-dcdc37635fc5\") "
Apr 20 07:53:34.609666 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:53:34.609640 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/397df66e-7585-4bc3-aaa7-dcdc37635fc5-installation-pull-secrets\") pod \"397df66e-7585-4bc3-aaa7-dcdc37635fc5\" (UID: \"397df66e-7585-4bc3-aaa7-dcdc37635fc5\") "
Apr 20 07:53:34.609908 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:53:34.609673 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/397df66e-7585-4bc3-aaa7-dcdc37635fc5-registry-certificates\") pod \"397df66e-7585-4bc3-aaa7-dcdc37635fc5\" (UID: \"397df66e-7585-4bc3-aaa7-dcdc37635fc5\") "
Apr 20 07:53:34.609908 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:53:34.609690 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vr5h5\" (UniqueName: \"kubernetes.io/projected/397df66e-7585-4bc3-aaa7-dcdc37635fc5-kube-api-access-vr5h5\") pod \"397df66e-7585-4bc3-aaa7-dcdc37635fc5\" (UID: \"397df66e-7585-4bc3-aaa7-dcdc37635fc5\") "
Apr 20 07:53:34.609908 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:53:34.609727 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/397df66e-7585-4bc3-aaa7-dcdc37635fc5-image-registry-private-configuration\") pod \"397df66e-7585-4bc3-aaa7-dcdc37635fc5\" (UID: 
\"397df66e-7585-4bc3-aaa7-dcdc37635fc5\") " Apr 20 07:53:34.609908 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:53:34.609759 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/397df66e-7585-4bc3-aaa7-dcdc37635fc5-ca-trust-extracted\") pod \"397df66e-7585-4bc3-aaa7-dcdc37635fc5\" (UID: \"397df66e-7585-4bc3-aaa7-dcdc37635fc5\") " Apr 20 07:53:34.609908 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:53:34.609786 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/397df66e-7585-4bc3-aaa7-dcdc37635fc5-trusted-ca\") pod \"397df66e-7585-4bc3-aaa7-dcdc37635fc5\" (UID: \"397df66e-7585-4bc3-aaa7-dcdc37635fc5\") " Apr 20 07:53:34.609908 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:53:34.609820 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/397df66e-7585-4bc3-aaa7-dcdc37635fc5-bound-sa-token\") pod \"397df66e-7585-4bc3-aaa7-dcdc37635fc5\" (UID: \"397df66e-7585-4bc3-aaa7-dcdc37635fc5\") " Apr 20 07:53:34.610196 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:53:34.610169 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/397df66e-7585-4bc3-aaa7-dcdc37635fc5-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "397df66e-7585-4bc3-aaa7-dcdc37635fc5" (UID: "397df66e-7585-4bc3-aaa7-dcdc37635fc5"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 07:53:34.610633 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:53:34.610591 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/397df66e-7585-4bc3-aaa7-dcdc37635fc5-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "397df66e-7585-4bc3-aaa7-dcdc37635fc5" (UID: "397df66e-7585-4bc3-aaa7-dcdc37635fc5"). 
InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 07:53:34.612656 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:53:34.612570 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/397df66e-7585-4bc3-aaa7-dcdc37635fc5-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "397df66e-7585-4bc3-aaa7-dcdc37635fc5" (UID: "397df66e-7585-4bc3-aaa7-dcdc37635fc5"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 07:53:34.612656 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:53:34.612584 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/397df66e-7585-4bc3-aaa7-dcdc37635fc5-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "397df66e-7585-4bc3-aaa7-dcdc37635fc5" (UID: "397df66e-7585-4bc3-aaa7-dcdc37635fc5"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 07:53:34.612656 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:53:34.612580 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/397df66e-7585-4bc3-aaa7-dcdc37635fc5-kube-api-access-vr5h5" (OuterVolumeSpecName: "kube-api-access-vr5h5") pod "397df66e-7585-4bc3-aaa7-dcdc37635fc5" (UID: "397df66e-7585-4bc3-aaa7-dcdc37635fc5"). InnerVolumeSpecName "kube-api-access-vr5h5". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 07:53:34.612898 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:53:34.612694 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/397df66e-7585-4bc3-aaa7-dcdc37635fc5-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "397df66e-7585-4bc3-aaa7-dcdc37635fc5" (UID: "397df66e-7585-4bc3-aaa7-dcdc37635fc5"). InnerVolumeSpecName "image-registry-private-configuration". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 07:53:34.612898 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:53:34.612708 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/397df66e-7585-4bc3-aaa7-dcdc37635fc5-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "397df66e-7585-4bc3-aaa7-dcdc37635fc5" (UID: "397df66e-7585-4bc3-aaa7-dcdc37635fc5"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 07:53:34.618507 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:53:34.618477 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/397df66e-7585-4bc3-aaa7-dcdc37635fc5-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "397df66e-7585-4bc3-aaa7-dcdc37635fc5" (UID: "397df66e-7585-4bc3-aaa7-dcdc37635fc5"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 07:53:34.711282 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:53:34.711237 2575 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/397df66e-7585-4bc3-aaa7-dcdc37635fc5-installation-pull-secrets\") on node \"ip-10-0-129-24.ec2.internal\" DevicePath \"\"" Apr 20 07:53:34.711282 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:53:34.711278 2575 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/397df66e-7585-4bc3-aaa7-dcdc37635fc5-registry-certificates\") on node \"ip-10-0-129-24.ec2.internal\" DevicePath \"\"" Apr 20 07:53:34.711282 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:53:34.711289 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vr5h5\" (UniqueName: \"kubernetes.io/projected/397df66e-7585-4bc3-aaa7-dcdc37635fc5-kube-api-access-vr5h5\") on node \"ip-10-0-129-24.ec2.internal\" DevicePath \"\"" Apr 20 07:53:34.711282 ip-10-0-129-24 
kubenswrapper[2575]: I0420 07:53:34.711300 2575 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/397df66e-7585-4bc3-aaa7-dcdc37635fc5-image-registry-private-configuration\") on node \"ip-10-0-129-24.ec2.internal\" DevicePath \"\"" Apr 20 07:53:34.711551 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:53:34.711310 2575 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/397df66e-7585-4bc3-aaa7-dcdc37635fc5-ca-trust-extracted\") on node \"ip-10-0-129-24.ec2.internal\" DevicePath \"\"" Apr 20 07:53:34.711551 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:53:34.711320 2575 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/397df66e-7585-4bc3-aaa7-dcdc37635fc5-trusted-ca\") on node \"ip-10-0-129-24.ec2.internal\" DevicePath \"\"" Apr 20 07:53:34.711551 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:53:34.711329 2575 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/397df66e-7585-4bc3-aaa7-dcdc37635fc5-bound-sa-token\") on node \"ip-10-0-129-24.ec2.internal\" DevicePath \"\"" Apr 20 07:53:34.711551 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:53:34.711336 2575 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/397df66e-7585-4bc3-aaa7-dcdc37635fc5-registry-tls\") on node \"ip-10-0-129-24.ec2.internal\" DevicePath \"\"" Apr 20 07:53:34.930937 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:53:34.930905 2575 generic.go:358] "Generic (PLEG): container finished" podID="397df66e-7585-4bc3-aaa7-dcdc37635fc5" containerID="a0b85fd28e0d12784c4e6e841d379b8de9325cd60ece03de81b3e77af83bea15" exitCode=0 Apr 20 07:53:34.931105 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:53:34.930962 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-7bd5554bd9-jrm2x" Apr 20 07:53:34.931105 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:53:34.930991 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7bd5554bd9-jrm2x" event={"ID":"397df66e-7585-4bc3-aaa7-dcdc37635fc5","Type":"ContainerDied","Data":"a0b85fd28e0d12784c4e6e841d379b8de9325cd60ece03de81b3e77af83bea15"} Apr 20 07:53:34.931105 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:53:34.931033 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7bd5554bd9-jrm2x" event={"ID":"397df66e-7585-4bc3-aaa7-dcdc37635fc5","Type":"ContainerDied","Data":"acde0e5f9c6798dded049a059d3184f1bc4c2b57731b533f9c5f3cd1261c1c59"} Apr 20 07:53:34.931105 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:53:34.931049 2575 scope.go:117] "RemoveContainer" containerID="a0b85fd28e0d12784c4e6e841d379b8de9325cd60ece03de81b3e77af83bea15" Apr 20 07:53:34.944234 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:53:34.944215 2575 scope.go:117] "RemoveContainer" containerID="a0b85fd28e0d12784c4e6e841d379b8de9325cd60ece03de81b3e77af83bea15" Apr 20 07:53:34.944504 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:53:34.944481 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a0b85fd28e0d12784c4e6e841d379b8de9325cd60ece03de81b3e77af83bea15\": container with ID starting with a0b85fd28e0d12784c4e6e841d379b8de9325cd60ece03de81b3e77af83bea15 not found: ID does not exist" containerID="a0b85fd28e0d12784c4e6e841d379b8de9325cd60ece03de81b3e77af83bea15" Apr 20 07:53:34.944553 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:53:34.944513 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0b85fd28e0d12784c4e6e841d379b8de9325cd60ece03de81b3e77af83bea15"} err="failed to get container status 
\"a0b85fd28e0d12784c4e6e841d379b8de9325cd60ece03de81b3e77af83bea15\": rpc error: code = NotFound desc = could not find container \"a0b85fd28e0d12784c4e6e841d379b8de9325cd60ece03de81b3e77af83bea15\": container with ID starting with a0b85fd28e0d12784c4e6e841d379b8de9325cd60ece03de81b3e77af83bea15 not found: ID does not exist" Apr 20 07:53:34.957247 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:53:34.957213 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-7bd5554bd9-jrm2x"] Apr 20 07:53:34.960719 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:53:34.960693 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-7bd5554bd9-jrm2x"] Apr 20 07:53:35.291330 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:53:35.291257 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="397df66e-7585-4bc3-aaa7-dcdc37635fc5" path="/var/lib/kubelet/pods/397df66e-7585-4bc3-aaa7-dcdc37635fc5/volumes" Apr 20 07:53:38.944658 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:53:38.944600 2575 generic.go:358] "Generic (PLEG): container finished" podID="1e9b4311-a667-4cd6-bc63-7df6eb349627" containerID="07ed189a190248310f7dc3ebe86a783ac3aa4d6f618ae1016e939ad6888ad319" exitCode=0 Apr 20 07:53:38.945072 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:53:38.944675 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-ztlpb" event={"ID":"1e9b4311-a667-4cd6-bc63-7df6eb349627","Type":"ContainerDied","Data":"07ed189a190248310f7dc3ebe86a783ac3aa4d6f618ae1016e939ad6888ad319"} Apr 20 07:53:38.945072 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:53:38.944971 2575 scope.go:117] "RemoveContainer" containerID="07ed189a190248310f7dc3ebe86a783ac3aa4d6f618ae1016e939ad6888ad319" Apr 20 07:53:39.948972 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:53:39.948937 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-ztlpb" event={"ID":"1e9b4311-a667-4cd6-bc63-7df6eb349627","Type":"ContainerStarted","Data":"547765ff4b7862793dde8c3e6a47d0c7599ddff1cf631fef9de7fd83dfede80c"} Apr 20 07:54:01.625928 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:01.625891 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 07:54:01.644486 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:01.644457 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 07:54:02.027020 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:02.026949 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 07:54:15.135396 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:15.135346 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/46b8deca-b33b-45e1-9131-b47fde192a78-metrics-certs\") pod \"network-metrics-daemon-7qpdh\" (UID: \"46b8deca-b33b-45e1-9131-b47fde192a78\") " pod="openshift-multus/network-metrics-daemon-7qpdh" Apr 20 07:54:15.137789 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:15.137765 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/46b8deca-b33b-45e1-9131-b47fde192a78-metrics-certs\") pod \"network-metrics-daemon-7qpdh\" (UID: \"46b8deca-b33b-45e1-9131-b47fde192a78\") " pod="openshift-multus/network-metrics-daemon-7qpdh" Apr 20 07:54:15.389278 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:15.389194 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-2g69h\"" Apr 20 07:54:15.397230 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:15.397210 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-7qpdh" Apr 20 07:54:15.513246 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:15.513215 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-7qpdh"] Apr 20 07:54:15.517411 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:54:15.517377 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod46b8deca_b33b_45e1_9131_b47fde192a78.slice/crio-168f29d00eeb10d308c7d5fd390cf7eb49f5604ae926a010e7b9018ed48c0b3f WatchSource:0}: Error finding container 168f29d00eeb10d308c7d5fd390cf7eb49f5604ae926a010e7b9018ed48c0b3f: Status 404 returned error can't find the container with id 168f29d00eeb10d308c7d5fd390cf7eb49f5604ae926a010e7b9018ed48c0b3f Apr 20 07:54:16.050695 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:16.050655 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-7qpdh" event={"ID":"46b8deca-b33b-45e1-9131-b47fde192a78","Type":"ContainerStarted","Data":"168f29d00eeb10d308c7d5fd390cf7eb49f5604ae926a010e7b9018ed48c0b3f"} Apr 20 07:54:17.058657 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:17.058600 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-7qpdh" event={"ID":"46b8deca-b33b-45e1-9131-b47fde192a78","Type":"ContainerStarted","Data":"82949eb47fb10b6f81a889711f0c470050ef9ed677727568435f4d900454aecb"} Apr 20 07:54:17.058657 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:17.058660 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-7qpdh" event={"ID":"46b8deca-b33b-45e1-9131-b47fde192a78","Type":"ContainerStarted","Data":"64cd1965e7026e0a324915b40777adeba53c0e15d465f4e8fa02db7392be4ae7"} Apr 20 07:54:17.076090 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:17.076041 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-multus/network-metrics-daemon-7qpdh" podStartSLOduration=253.125591574 podStartE2EDuration="4m14.076026814s" podCreationTimestamp="2026-04-20 07:50:03 +0000 UTC" firstStartedPulling="2026-04-20 07:54:15.51962449 +0000 UTC m=+252.692951911" lastFinishedPulling="2026-04-20 07:54:16.470059744 +0000 UTC m=+253.643387151" observedRunningTime="2026-04-20 07:54:17.074859091 +0000 UTC m=+254.248186521" watchObservedRunningTime="2026-04-20 07:54:17.076026814 +0000 UTC m=+254.249354242" Apr 20 07:54:19.687377 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:19.687344 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 20 07:54:19.687972 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:19.687946 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39" containerName="prometheus" containerID="cri-o://03e96809214c8727175e151a8d7165ec5e15ce92a10f089ca44b895f52a891eb" gracePeriod=600 Apr 20 07:54:19.688063 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:19.687999 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39" containerName="kube-rbac-proxy-thanos" containerID="cri-o://17b131269b33c53c7c59ae5b4788740822d947d4df21565157e666856607ce79" gracePeriod=600 Apr 20 07:54:19.688063 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:19.688023 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39" containerName="kube-rbac-proxy-web" containerID="cri-o://61f6eecd6bae27dbd2dacfd6bef766491d5d694257c114e4aa9413f410ec9a86" gracePeriod=600 Apr 20 07:54:19.688160 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:19.688087 2575 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="openshift-monitoring/prometheus-k8s-0" podUID="3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39" containerName="kube-rbac-proxy" containerID="cri-o://932f56abf3b4eb35db8c273278f0a3577b9139b698645cf5ec1c13178151a7e0" gracePeriod=600 Apr 20 07:54:19.688160 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:19.688096 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39" containerName="config-reloader" containerID="cri-o://ecce3c069fce4a7a9117c7fc8e19ce2c76ad94f106d04ab1bd2790ea906e3b0a" gracePeriod=600 Apr 20 07:54:19.688160 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:19.688100 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39" containerName="thanos-sidecar" containerID="cri-o://661ffabef5963a2d97ca20bf50c2513abd107dc3dd564607222d4a9516a0b06a" gracePeriod=600 Apr 20 07:54:20.072488 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:20.072378 2575 generic.go:358] "Generic (PLEG): container finished" podID="3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39" containerID="17b131269b33c53c7c59ae5b4788740822d947d4df21565157e666856607ce79" exitCode=0 Apr 20 07:54:20.072488 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:20.072400 2575 generic.go:358] "Generic (PLEG): container finished" podID="3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39" containerID="932f56abf3b4eb35db8c273278f0a3577b9139b698645cf5ec1c13178151a7e0" exitCode=0 Apr 20 07:54:20.072488 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:20.072408 2575 generic.go:358] "Generic (PLEG): container finished" podID="3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39" containerID="661ffabef5963a2d97ca20bf50c2513abd107dc3dd564607222d4a9516a0b06a" exitCode=0 Apr 20 07:54:20.072488 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:20.072416 2575 generic.go:358] "Generic (PLEG): container finished" podID="3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39" 
containerID="ecce3c069fce4a7a9117c7fc8e19ce2c76ad94f106d04ab1bd2790ea906e3b0a" exitCode=0 Apr 20 07:54:20.072488 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:20.072421 2575 generic.go:358] "Generic (PLEG): container finished" podID="3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39" containerID="03e96809214c8727175e151a8d7165ec5e15ce92a10f089ca44b895f52a891eb" exitCode=0 Apr 20 07:54:20.072488 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:20.072452 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39","Type":"ContainerDied","Data":"17b131269b33c53c7c59ae5b4788740822d947d4df21565157e666856607ce79"} Apr 20 07:54:20.072488 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:20.072491 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39","Type":"ContainerDied","Data":"932f56abf3b4eb35db8c273278f0a3577b9139b698645cf5ec1c13178151a7e0"} Apr 20 07:54:20.072840 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:20.072506 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39","Type":"ContainerDied","Data":"661ffabef5963a2d97ca20bf50c2513abd107dc3dd564607222d4a9516a0b06a"} Apr 20 07:54:20.072840 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:20.072517 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39","Type":"ContainerDied","Data":"ecce3c069fce4a7a9117c7fc8e19ce2c76ad94f106d04ab1bd2790ea906e3b0a"} Apr 20 07:54:20.072840 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:20.072529 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39","Type":"ContainerDied","Data":"03e96809214c8727175e151a8d7165ec5e15ce92a10f089ca44b895f52a891eb"} Apr 20 07:54:20.930054 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:20.930034 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 07:54:21.078976 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.078888 2575 generic.go:358] "Generic (PLEG): container finished" podID="3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39" containerID="61f6eecd6bae27dbd2dacfd6bef766491d5d694257c114e4aa9413f410ec9a86" exitCode=0 Apr 20 07:54:21.078976 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.078959 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39","Type":"ContainerDied","Data":"61f6eecd6bae27dbd2dacfd6bef766491d5d694257c114e4aa9413f410ec9a86"} Apr 20 07:54:21.079148 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.078988 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39","Type":"ContainerDied","Data":"75ae7783822f6ce0e115321a63afd50e128a33ef5098d585c95176b8c5850b19"} Apr 20 07:54:21.079148 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.079004 2575 scope.go:117] "RemoveContainer" containerID="17b131269b33c53c7c59ae5b4788740822d947d4df21565157e666856607ce79" Apr 20 07:54:21.079148 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.079003 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 07:54:21.084050 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.084024 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39\" (UID: \"3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39\") " Apr 20 07:54:21.084173 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.084056 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39-secret-prometheus-k8s-tls\") pod \"3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39\" (UID: \"3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39\") " Apr 20 07:54:21.084173 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.084076 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39\" (UID: \"3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39\") " Apr 20 07:54:21.084173 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.084102 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39-web-config\") pod \"3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39\" (UID: \"3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39\") " Apr 20 07:54:21.084173 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.084147 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39-secret-metrics-client-certs\") pod 
\"3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39\" (UID: \"3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39\") " Apr 20 07:54:21.084378 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.084176 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39-secret-kube-rbac-proxy\") pod \"3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39\" (UID: \"3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39\") " Apr 20 07:54:21.084378 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.084246 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39-prometheus-k8s-rulefiles-0\") pod \"3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39\" (UID: \"3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39\") " Apr 20 07:54:21.084378 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.084277 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39-prometheus-k8s-db\") pod \"3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39\" (UID: \"3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39\") " Apr 20 07:54:21.084378 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.084347 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39-config\") pod \"3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39\" (UID: \"3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39\") " Apr 20 07:54:21.084578 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.084379 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39-configmap-serving-certs-ca-bundle\") pod \"3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39\" (UID: 
\"3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39\") " Apr 20 07:54:21.084578 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.084403 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39-thanos-prometheus-http-client-file\") pod \"3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39\" (UID: \"3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39\") " Apr 20 07:54:21.084578 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.084457 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hh9sq\" (UniqueName: \"kubernetes.io/projected/3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39-kube-api-access-hh9sq\") pod \"3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39\" (UID: \"3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39\") " Apr 20 07:54:21.084578 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.084486 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39-prometheus-trusted-ca-bundle\") pod \"3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39\" (UID: \"3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39\") " Apr 20 07:54:21.084578 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.084511 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39-configmap-kubelet-serving-ca-bundle\") pod \"3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39\" (UID: \"3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39\") " Apr 20 07:54:21.084578 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.084542 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39-configmap-metrics-client-ca\") pod 
\"3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39\" (UID: \"3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39\") " Apr 20 07:54:21.084894 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.084588 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39-tls-assets\") pod \"3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39\" (UID: \"3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39\") " Apr 20 07:54:21.084894 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.084628 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39-config-out\") pod \"3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39\" (UID: \"3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39\") " Apr 20 07:54:21.084894 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.084653 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39-secret-grpc-tls\") pod \"3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39\" (UID: \"3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39\") " Apr 20 07:54:21.085772 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.085556 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39" (UID: "3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39"). InnerVolumeSpecName "prometheus-k8s-db". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 07:54:21.085772 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.085654 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39" (UID: "3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 07:54:21.085942 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.085867 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39" (UID: "3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39"). InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 07:54:21.087195 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.087078 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39" (UID: "3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 07:54:21.087479 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.087408 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39" (UID: "3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39"). 
InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 07:54:21.087547 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.087518 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39" (UID: "3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 07:54:21.087605 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.087546 2575 scope.go:117] "RemoveContainer" containerID="932f56abf3b4eb35db8c273278f0a3577b9139b698645cf5ec1c13178151a7e0" Apr 20 07:54:21.087744 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.087711 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39" (UID: "3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 07:54:21.088281 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.088049 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39" (UID: "3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39"). InnerVolumeSpecName "secret-prometheus-k8s-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 07:54:21.088465 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.088368 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39" (UID: "3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 07:54:21.089216 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.089182 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39" (UID: "3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 07:54:21.089314 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.089254 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39" (UID: "3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 07:54:21.089890 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.089847 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39" (UID: "3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39"). InnerVolumeSpecName "tls-assets". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 07:54:21.090543 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.090515 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39-config-out" (OuterVolumeSpecName: "config-out") pod "3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39" (UID: "3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 07:54:21.090652 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.090598 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39" (UID: "3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 07:54:21.091037 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.091002 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39" (UID: "3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 07:54:21.091309 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.091290 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39-config" (OuterVolumeSpecName: "config") pod "3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39" (UID: "3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 07:54:21.091438 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.091417 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39-kube-api-access-hh9sq" (OuterVolumeSpecName: "kube-api-access-hh9sq") pod "3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39" (UID: "3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39"). InnerVolumeSpecName "kube-api-access-hh9sq". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 07:54:21.100244 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.100219 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39-web-config" (OuterVolumeSpecName: "web-config") pod "3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39" (UID: "3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 07:54:21.106686 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.106665 2575 scope.go:117] "RemoveContainer" containerID="61f6eecd6bae27dbd2dacfd6bef766491d5d694257c114e4aa9413f410ec9a86" Apr 20 07:54:21.112921 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.112897 2575 scope.go:117] "RemoveContainer" containerID="661ffabef5963a2d97ca20bf50c2513abd107dc3dd564607222d4a9516a0b06a" Apr 20 07:54:21.119032 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.119015 2575 scope.go:117] "RemoveContainer" containerID="ecce3c069fce4a7a9117c7fc8e19ce2c76ad94f106d04ab1bd2790ea906e3b0a" Apr 20 07:54:21.126016 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.125996 2575 scope.go:117] "RemoveContainer" containerID="03e96809214c8727175e151a8d7165ec5e15ce92a10f089ca44b895f52a891eb" Apr 20 07:54:21.132455 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.132440 2575 scope.go:117] "RemoveContainer" containerID="3be3f96e59beed94abfe2dc15da3a224f92eaef36a412a02c4832a447948ef49" Apr 20 07:54:21.138416 ip-10-0-129-24 
kubenswrapper[2575]: I0420 07:54:21.138400 2575 scope.go:117] "RemoveContainer" containerID="17b131269b33c53c7c59ae5b4788740822d947d4df21565157e666856607ce79" Apr 20 07:54:21.138696 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:54:21.138672 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"17b131269b33c53c7c59ae5b4788740822d947d4df21565157e666856607ce79\": container with ID starting with 17b131269b33c53c7c59ae5b4788740822d947d4df21565157e666856607ce79 not found: ID does not exist" containerID="17b131269b33c53c7c59ae5b4788740822d947d4df21565157e666856607ce79" Apr 20 07:54:21.138784 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.138702 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17b131269b33c53c7c59ae5b4788740822d947d4df21565157e666856607ce79"} err="failed to get container status \"17b131269b33c53c7c59ae5b4788740822d947d4df21565157e666856607ce79\": rpc error: code = NotFound desc = could not find container \"17b131269b33c53c7c59ae5b4788740822d947d4df21565157e666856607ce79\": container with ID starting with 17b131269b33c53c7c59ae5b4788740822d947d4df21565157e666856607ce79 not found: ID does not exist" Apr 20 07:54:21.138784 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.138721 2575 scope.go:117] "RemoveContainer" containerID="932f56abf3b4eb35db8c273278f0a3577b9139b698645cf5ec1c13178151a7e0" Apr 20 07:54:21.138959 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:54:21.138945 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"932f56abf3b4eb35db8c273278f0a3577b9139b698645cf5ec1c13178151a7e0\": container with ID starting with 932f56abf3b4eb35db8c273278f0a3577b9139b698645cf5ec1c13178151a7e0 not found: ID does not exist" containerID="932f56abf3b4eb35db8c273278f0a3577b9139b698645cf5ec1c13178151a7e0" Apr 20 07:54:21.139000 ip-10-0-129-24 kubenswrapper[2575]: I0420 
07:54:21.138963 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"932f56abf3b4eb35db8c273278f0a3577b9139b698645cf5ec1c13178151a7e0"} err="failed to get container status \"932f56abf3b4eb35db8c273278f0a3577b9139b698645cf5ec1c13178151a7e0\": rpc error: code = NotFound desc = could not find container \"932f56abf3b4eb35db8c273278f0a3577b9139b698645cf5ec1c13178151a7e0\": container with ID starting with 932f56abf3b4eb35db8c273278f0a3577b9139b698645cf5ec1c13178151a7e0 not found: ID does not exist" Apr 20 07:54:21.139000 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.138976 2575 scope.go:117] "RemoveContainer" containerID="61f6eecd6bae27dbd2dacfd6bef766491d5d694257c114e4aa9413f410ec9a86" Apr 20 07:54:21.139196 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:54:21.139180 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"61f6eecd6bae27dbd2dacfd6bef766491d5d694257c114e4aa9413f410ec9a86\": container with ID starting with 61f6eecd6bae27dbd2dacfd6bef766491d5d694257c114e4aa9413f410ec9a86 not found: ID does not exist" containerID="61f6eecd6bae27dbd2dacfd6bef766491d5d694257c114e4aa9413f410ec9a86" Apr 20 07:54:21.139232 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.139200 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61f6eecd6bae27dbd2dacfd6bef766491d5d694257c114e4aa9413f410ec9a86"} err="failed to get container status \"61f6eecd6bae27dbd2dacfd6bef766491d5d694257c114e4aa9413f410ec9a86\": rpc error: code = NotFound desc = could not find container \"61f6eecd6bae27dbd2dacfd6bef766491d5d694257c114e4aa9413f410ec9a86\": container with ID starting with 61f6eecd6bae27dbd2dacfd6bef766491d5d694257c114e4aa9413f410ec9a86 not found: ID does not exist" Apr 20 07:54:21.139232 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.139216 2575 scope.go:117] "RemoveContainer" 
containerID="661ffabef5963a2d97ca20bf50c2513abd107dc3dd564607222d4a9516a0b06a" Apr 20 07:54:21.139412 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:54:21.139393 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"661ffabef5963a2d97ca20bf50c2513abd107dc3dd564607222d4a9516a0b06a\": container with ID starting with 661ffabef5963a2d97ca20bf50c2513abd107dc3dd564607222d4a9516a0b06a not found: ID does not exist" containerID="661ffabef5963a2d97ca20bf50c2513abd107dc3dd564607222d4a9516a0b06a" Apr 20 07:54:21.139479 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.139421 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"661ffabef5963a2d97ca20bf50c2513abd107dc3dd564607222d4a9516a0b06a"} err="failed to get container status \"661ffabef5963a2d97ca20bf50c2513abd107dc3dd564607222d4a9516a0b06a\": rpc error: code = NotFound desc = could not find container \"661ffabef5963a2d97ca20bf50c2513abd107dc3dd564607222d4a9516a0b06a\": container with ID starting with 661ffabef5963a2d97ca20bf50c2513abd107dc3dd564607222d4a9516a0b06a not found: ID does not exist" Apr 20 07:54:21.139479 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.139444 2575 scope.go:117] "RemoveContainer" containerID="ecce3c069fce4a7a9117c7fc8e19ce2c76ad94f106d04ab1bd2790ea906e3b0a" Apr 20 07:54:21.139678 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:54:21.139661 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ecce3c069fce4a7a9117c7fc8e19ce2c76ad94f106d04ab1bd2790ea906e3b0a\": container with ID starting with ecce3c069fce4a7a9117c7fc8e19ce2c76ad94f106d04ab1bd2790ea906e3b0a not found: ID does not exist" containerID="ecce3c069fce4a7a9117c7fc8e19ce2c76ad94f106d04ab1bd2790ea906e3b0a" Apr 20 07:54:21.139753 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.139679 2575 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"ecce3c069fce4a7a9117c7fc8e19ce2c76ad94f106d04ab1bd2790ea906e3b0a"} err="failed to get container status \"ecce3c069fce4a7a9117c7fc8e19ce2c76ad94f106d04ab1bd2790ea906e3b0a\": rpc error: code = NotFound desc = could not find container \"ecce3c069fce4a7a9117c7fc8e19ce2c76ad94f106d04ab1bd2790ea906e3b0a\": container with ID starting with ecce3c069fce4a7a9117c7fc8e19ce2c76ad94f106d04ab1bd2790ea906e3b0a not found: ID does not exist" Apr 20 07:54:21.139753 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.139690 2575 scope.go:117] "RemoveContainer" containerID="03e96809214c8727175e151a8d7165ec5e15ce92a10f089ca44b895f52a891eb" Apr 20 07:54:21.139900 ip-10-0-129-24 kubenswrapper[2575]: E0420 07:54:21.139880 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03e96809214c8727175e151a8d7165ec5e15ce92a10f089ca44b895f52a891eb\": container with ID starting with 03e96809214c8727175e151a8d7165ec5e15ce92a10f089ca44b895f52a891eb not found: ID does not exist" containerID="03e96809214c8727175e151a8d7165ec5e15ce92a10f089ca44b895f52a891eb" Apr 20 07:54:21.139962 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.139911 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03e96809214c8727175e151a8d7165ec5e15ce92a10f089ca44b895f52a891eb"} err="failed to get container status \"03e96809214c8727175e151a8d7165ec5e15ce92a10f089ca44b895f52a891eb\": rpc error: code = NotFound desc = could not find container \"03e96809214c8727175e151a8d7165ec5e15ce92a10f089ca44b895f52a891eb\": container with ID starting with 03e96809214c8727175e151a8d7165ec5e15ce92a10f089ca44b895f52a891eb not found: ID does not exist" Apr 20 07:54:21.139962 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.139935 2575 scope.go:117] "RemoveContainer" containerID="3be3f96e59beed94abfe2dc15da3a224f92eaef36a412a02c4832a447948ef49" Apr 20 07:54:21.140152 ip-10-0-129-24 
kubenswrapper[2575]: E0420 07:54:21.140137 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3be3f96e59beed94abfe2dc15da3a224f92eaef36a412a02c4832a447948ef49\": container with ID starting with 3be3f96e59beed94abfe2dc15da3a224f92eaef36a412a02c4832a447948ef49 not found: ID does not exist" containerID="3be3f96e59beed94abfe2dc15da3a224f92eaef36a412a02c4832a447948ef49" Apr 20 07:54:21.140190 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.140157 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3be3f96e59beed94abfe2dc15da3a224f92eaef36a412a02c4832a447948ef49"} err="failed to get container status \"3be3f96e59beed94abfe2dc15da3a224f92eaef36a412a02c4832a447948ef49\": rpc error: code = NotFound desc = could not find container \"3be3f96e59beed94abfe2dc15da3a224f92eaef36a412a02c4832a447948ef49\": container with ID starting with 3be3f96e59beed94abfe2dc15da3a224f92eaef36a412a02c4832a447948ef49 not found: ID does not exist" Apr 20 07:54:21.185680 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.185641 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hh9sq\" (UniqueName: \"kubernetes.io/projected/3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39-kube-api-access-hh9sq\") on node \"ip-10-0-129-24.ec2.internal\" DevicePath \"\"" Apr 20 07:54:21.185680 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.185676 2575 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39-prometheus-trusted-ca-bundle\") on node \"ip-10-0-129-24.ec2.internal\" DevicePath \"\"" Apr 20 07:54:21.185680 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.185688 2575 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-129-24.ec2.internal\" DevicePath \"\"" Apr 20 07:54:21.185868 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.185700 2575 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39-configmap-metrics-client-ca\") on node \"ip-10-0-129-24.ec2.internal\" DevicePath \"\"" Apr 20 07:54:21.185868 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.185710 2575 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39-tls-assets\") on node \"ip-10-0-129-24.ec2.internal\" DevicePath \"\"" Apr 20 07:54:21.185868 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.185719 2575 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39-config-out\") on node \"ip-10-0-129-24.ec2.internal\" DevicePath \"\"" Apr 20 07:54:21.185868 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.185727 2575 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39-secret-grpc-tls\") on node \"ip-10-0-129-24.ec2.internal\" DevicePath \"\"" Apr 20 07:54:21.185868 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.185735 2575 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-129-24.ec2.internal\" DevicePath \"\"" Apr 20 07:54:21.185868 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.185745 2575 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: 
\"kubernetes.io/secret/3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39-secret-prometheus-k8s-tls\") on node \"ip-10-0-129-24.ec2.internal\" DevicePath \"\"" Apr 20 07:54:21.185868 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.185754 2575 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-129-24.ec2.internal\" DevicePath \"\"" Apr 20 07:54:21.185868 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.185763 2575 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39-web-config\") on node \"ip-10-0-129-24.ec2.internal\" DevicePath \"\"" Apr 20 07:54:21.185868 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.185771 2575 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39-secret-metrics-client-certs\") on node \"ip-10-0-129-24.ec2.internal\" DevicePath \"\"" Apr 20 07:54:21.185868 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.185780 2575 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39-secret-kube-rbac-proxy\") on node \"ip-10-0-129-24.ec2.internal\" DevicePath \"\"" Apr 20 07:54:21.185868 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.185790 2575 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-129-24.ec2.internal\" DevicePath \"\"" Apr 20 07:54:21.185868 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.185800 2575 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: 
\"kubernetes.io/empty-dir/3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39-prometheus-k8s-db\") on node \"ip-10-0-129-24.ec2.internal\" DevicePath \"\"" Apr 20 07:54:21.185868 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.185808 2575 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39-config\") on node \"ip-10-0-129-24.ec2.internal\" DevicePath \"\"" Apr 20 07:54:21.185868 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.185817 2575 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-129-24.ec2.internal\" DevicePath \"\"" Apr 20 07:54:21.185868 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.185826 2575 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39-thanos-prometheus-http-client-file\") on node \"ip-10-0-129-24.ec2.internal\" DevicePath \"\"" Apr 20 07:54:21.402506 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.400279 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 20 07:54:21.405021 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.404991 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 20 07:54:21.425152 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.425125 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 20 07:54:21.425440 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.425425 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="397df66e-7585-4bc3-aaa7-dcdc37635fc5" containerName="registry" Apr 20 07:54:21.425508 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.425443 2575 
state_mem.go:107] "Deleted CPUSet assignment" podUID="397df66e-7585-4bc3-aaa7-dcdc37635fc5" containerName="registry" Apr 20 07:54:21.425508 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.425456 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39" containerName="thanos-sidecar" Apr 20 07:54:21.425508 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.425465 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39" containerName="thanos-sidecar" Apr 20 07:54:21.425508 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.425483 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39" containerName="prometheus" Apr 20 07:54:21.425508 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.425491 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39" containerName="prometheus" Apr 20 07:54:21.425790 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.425510 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39" containerName="kube-rbac-proxy-web" Apr 20 07:54:21.425790 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.425518 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39" containerName="kube-rbac-proxy-web" Apr 20 07:54:21.425790 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.425534 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39" containerName="kube-rbac-proxy" Apr 20 07:54:21.425790 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.425541 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39" containerName="kube-rbac-proxy" Apr 20 07:54:21.425790 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.425553 2575 
cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39" containerName="init-config-reloader" Apr 20 07:54:21.425790 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.425563 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39" containerName="init-config-reloader" Apr 20 07:54:21.425790 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.425571 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39" containerName="config-reloader" Apr 20 07:54:21.425790 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.425579 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39" containerName="config-reloader" Apr 20 07:54:21.425790 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.425592 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39" containerName="kube-rbac-proxy-thanos" Apr 20 07:54:21.425790 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.425601 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39" containerName="kube-rbac-proxy-thanos" Apr 20 07:54:21.425790 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.425679 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39" containerName="prometheus" Apr 20 07:54:21.425790 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.425692 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39" containerName="kube-rbac-proxy" Apr 20 07:54:21.425790 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.425702 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39" containerName="kube-rbac-proxy-thanos" Apr 20 07:54:21.425790 ip-10-0-129-24 
kubenswrapper[2575]: I0420 07:54:21.425713 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39" containerName="thanos-sidecar" Apr 20 07:54:21.425790 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.425725 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39" containerName="kube-rbac-proxy-web" Apr 20 07:54:21.425790 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.425736 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="397df66e-7585-4bc3-aaa7-dcdc37635fc5" containerName="registry" Apr 20 07:54:21.425790 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.425746 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39" containerName="config-reloader" Apr 20 07:54:21.431274 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.431255 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 07:54:21.434215 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.434196 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 20 07:54:21.434430 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.434404 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-20pri24etom4j\"" Apr 20 07:54:21.434546 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.434450 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 20 07:54:21.434546 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.434404 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 20 07:54:21.434546 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.434414 2575 reflector.go:430] 
"Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 20 07:54:21.434718 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.434694 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 20 07:54:21.434780 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.434721 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 20 07:54:21.434780 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.434725 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 20 07:54:21.434780 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.434741 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 20 07:54:21.435555 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.435536 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 20 07:54:21.435555 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.435550 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 20 07:54:21.435713 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.435667 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-jvrkd\"" Apr 20 07:54:21.435713 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.435691 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 20 07:54:21.437476 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.437452 2575 reflector.go:430] "Caches populated" 
type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 20 07:54:21.441019 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.439701 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 20 07:54:21.442378 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.442359 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 20 07:54:21.589312 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.589275 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5e53bbd8-7519-4447-aff6-96c8adf1d541-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"5e53bbd8-7519-4447-aff6-96c8adf1d541\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 07:54:21.589312 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.589313 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pl9vl\" (UniqueName: \"kubernetes.io/projected/5e53bbd8-7519-4447-aff6-96c8adf1d541-kube-api-access-pl9vl\") pod \"prometheus-k8s-0\" (UID: \"5e53bbd8-7519-4447-aff6-96c8adf1d541\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 07:54:21.589569 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.589339 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5e53bbd8-7519-4447-aff6-96c8adf1d541-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"5e53bbd8-7519-4447-aff6-96c8adf1d541\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 07:54:21.589569 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.589357 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/5e53bbd8-7519-4447-aff6-96c8adf1d541-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"5e53bbd8-7519-4447-aff6-96c8adf1d541\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 07:54:21.589569 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.589407 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/5e53bbd8-7519-4447-aff6-96c8adf1d541-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"5e53bbd8-7519-4447-aff6-96c8adf1d541\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 07:54:21.589569 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.589460 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/5e53bbd8-7519-4447-aff6-96c8adf1d541-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"5e53bbd8-7519-4447-aff6-96c8adf1d541\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 07:54:21.589569 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.589487 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/5e53bbd8-7519-4447-aff6-96c8adf1d541-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"5e53bbd8-7519-4447-aff6-96c8adf1d541\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 07:54:21.589569 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.589517 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/5e53bbd8-7519-4447-aff6-96c8adf1d541-config-out\") pod \"prometheus-k8s-0\" (UID: \"5e53bbd8-7519-4447-aff6-96c8adf1d541\") 
" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 07:54:21.589569 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.589542 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/5e53bbd8-7519-4447-aff6-96c8adf1d541-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"5e53bbd8-7519-4447-aff6-96c8adf1d541\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 07:54:21.589569 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.589566 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/5e53bbd8-7519-4447-aff6-96c8adf1d541-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"5e53bbd8-7519-4447-aff6-96c8adf1d541\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 07:54:21.589939 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.589647 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5e53bbd8-7519-4447-aff6-96c8adf1d541-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"5e53bbd8-7519-4447-aff6-96c8adf1d541\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 07:54:21.589939 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.589679 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/5e53bbd8-7519-4447-aff6-96c8adf1d541-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"5e53bbd8-7519-4447-aff6-96c8adf1d541\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 07:54:21.589939 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.589703 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: 
\"kubernetes.io/secret/5e53bbd8-7519-4447-aff6-96c8adf1d541-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"5e53bbd8-7519-4447-aff6-96c8adf1d541\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 07:54:21.589939 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.589732 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5e53bbd8-7519-4447-aff6-96c8adf1d541-config\") pod \"prometheus-k8s-0\" (UID: \"5e53bbd8-7519-4447-aff6-96c8adf1d541\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 07:54:21.589939 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.589806 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/5e53bbd8-7519-4447-aff6-96c8adf1d541-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"5e53bbd8-7519-4447-aff6-96c8adf1d541\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 07:54:21.589939 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.589834 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/5e53bbd8-7519-4447-aff6-96c8adf1d541-web-config\") pod \"prometheus-k8s-0\" (UID: \"5e53bbd8-7519-4447-aff6-96c8adf1d541\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 07:54:21.589939 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.589851 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/5e53bbd8-7519-4447-aff6-96c8adf1d541-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"5e53bbd8-7519-4447-aff6-96c8adf1d541\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 07:54:21.589939 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.589878 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5e53bbd8-7519-4447-aff6-96c8adf1d541-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"5e53bbd8-7519-4447-aff6-96c8adf1d541\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 07:54:21.691291 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.691212 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pl9vl\" (UniqueName: \"kubernetes.io/projected/5e53bbd8-7519-4447-aff6-96c8adf1d541-kube-api-access-pl9vl\") pod \"prometheus-k8s-0\" (UID: \"5e53bbd8-7519-4447-aff6-96c8adf1d541\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 07:54:21.691291 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.691252 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5e53bbd8-7519-4447-aff6-96c8adf1d541-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"5e53bbd8-7519-4447-aff6-96c8adf1d541\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 07:54:21.691291 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.691272 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/5e53bbd8-7519-4447-aff6-96c8adf1d541-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"5e53bbd8-7519-4447-aff6-96c8adf1d541\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 07:54:21.691291 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.691291 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/5e53bbd8-7519-4447-aff6-96c8adf1d541-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"5e53bbd8-7519-4447-aff6-96c8adf1d541\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 20 07:54:21.691575 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.691316 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/5e53bbd8-7519-4447-aff6-96c8adf1d541-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"5e53bbd8-7519-4447-aff6-96c8adf1d541\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 07:54:21.691575 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.691434 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/5e53bbd8-7519-4447-aff6-96c8adf1d541-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"5e53bbd8-7519-4447-aff6-96c8adf1d541\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 07:54:21.691575 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.691480 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/5e53bbd8-7519-4447-aff6-96c8adf1d541-config-out\") pod \"prometheus-k8s-0\" (UID: \"5e53bbd8-7519-4447-aff6-96c8adf1d541\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 07:54:21.691575 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.691509 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/5e53bbd8-7519-4447-aff6-96c8adf1d541-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"5e53bbd8-7519-4447-aff6-96c8adf1d541\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 07:54:21.691575 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.691534 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/5e53bbd8-7519-4447-aff6-96c8adf1d541-prometheus-k8s-rulefiles-0\") pod 
\"prometheus-k8s-0\" (UID: \"5e53bbd8-7519-4447-aff6-96c8adf1d541\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 07:54:21.691575 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.691567 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5e53bbd8-7519-4447-aff6-96c8adf1d541-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"5e53bbd8-7519-4447-aff6-96c8adf1d541\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 07:54:21.691884 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.691631 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/5e53bbd8-7519-4447-aff6-96c8adf1d541-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"5e53bbd8-7519-4447-aff6-96c8adf1d541\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 07:54:21.691884 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.691663 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/5e53bbd8-7519-4447-aff6-96c8adf1d541-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"5e53bbd8-7519-4447-aff6-96c8adf1d541\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 07:54:21.691884 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.691694 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5e53bbd8-7519-4447-aff6-96c8adf1d541-config\") pod \"prometheus-k8s-0\" (UID: \"5e53bbd8-7519-4447-aff6-96c8adf1d541\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 07:54:21.691884 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.691729 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/5e53bbd8-7519-4447-aff6-96c8adf1d541-secret-grpc-tls\") pod 
\"prometheus-k8s-0\" (UID: \"5e53bbd8-7519-4447-aff6-96c8adf1d541\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 07:54:21.691884 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.691758 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/5e53bbd8-7519-4447-aff6-96c8adf1d541-web-config\") pod \"prometheus-k8s-0\" (UID: \"5e53bbd8-7519-4447-aff6-96c8adf1d541\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 07:54:21.691884 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.691785 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/5e53bbd8-7519-4447-aff6-96c8adf1d541-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"5e53bbd8-7519-4447-aff6-96c8adf1d541\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 07:54:21.691884 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.691830 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5e53bbd8-7519-4447-aff6-96c8adf1d541-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"5e53bbd8-7519-4447-aff6-96c8adf1d541\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 07:54:21.691884 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.691877 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5e53bbd8-7519-4447-aff6-96c8adf1d541-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"5e53bbd8-7519-4447-aff6-96c8adf1d541\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 07:54:21.692262 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.692057 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/5e53bbd8-7519-4447-aff6-96c8adf1d541-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"5e53bbd8-7519-4447-aff6-96c8adf1d541\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 07:54:21.692329 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.692308 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/5e53bbd8-7519-4447-aff6-96c8adf1d541-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"5e53bbd8-7519-4447-aff6-96c8adf1d541\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 07:54:21.692561 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.692503 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5e53bbd8-7519-4447-aff6-96c8adf1d541-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"5e53bbd8-7519-4447-aff6-96c8adf1d541\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 07:54:21.693326 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.693294 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5e53bbd8-7519-4447-aff6-96c8adf1d541-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"5e53bbd8-7519-4447-aff6-96c8adf1d541\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 07:54:21.694753 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.694370 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5e53bbd8-7519-4447-aff6-96c8adf1d541-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"5e53bbd8-7519-4447-aff6-96c8adf1d541\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 07:54:21.694753 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.694712 2575 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/5e53bbd8-7519-4447-aff6-96c8adf1d541-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"5e53bbd8-7519-4447-aff6-96c8adf1d541\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 07:54:21.695394 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.694921 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/5e53bbd8-7519-4447-aff6-96c8adf1d541-web-config\") pod \"prometheus-k8s-0\" (UID: \"5e53bbd8-7519-4447-aff6-96c8adf1d541\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 07:54:21.695394 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.695211 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/5e53bbd8-7519-4447-aff6-96c8adf1d541-config-out\") pod \"prometheus-k8s-0\" (UID: \"5e53bbd8-7519-4447-aff6-96c8adf1d541\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 07:54:21.695394 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.695225 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/5e53bbd8-7519-4447-aff6-96c8adf1d541-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"5e53bbd8-7519-4447-aff6-96c8adf1d541\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 07:54:21.695555 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.695482 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/5e53bbd8-7519-4447-aff6-96c8adf1d541-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"5e53bbd8-7519-4447-aff6-96c8adf1d541\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 07:54:21.695738 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.695717 2575 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/5e53bbd8-7519-4447-aff6-96c8adf1d541-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"5e53bbd8-7519-4447-aff6-96c8adf1d541\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 07:54:21.696255 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.696215 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/5e53bbd8-7519-4447-aff6-96c8adf1d541-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"5e53bbd8-7519-4447-aff6-96c8adf1d541\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 07:54:21.696361 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.696345 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/5e53bbd8-7519-4447-aff6-96c8adf1d541-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"5e53bbd8-7519-4447-aff6-96c8adf1d541\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 07:54:21.696416 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.696397 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/5e53bbd8-7519-4447-aff6-96c8adf1d541-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"5e53bbd8-7519-4447-aff6-96c8adf1d541\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 07:54:21.696984 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.696961 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/5e53bbd8-7519-4447-aff6-96c8adf1d541-config\") pod \"prometheus-k8s-0\" (UID: \"5e53bbd8-7519-4447-aff6-96c8adf1d541\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 07:54:21.697070 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.697018 2575 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/5e53bbd8-7519-4447-aff6-96c8adf1d541-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"5e53bbd8-7519-4447-aff6-96c8adf1d541\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 07:54:21.697264 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.697247 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/5e53bbd8-7519-4447-aff6-96c8adf1d541-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"5e53bbd8-7519-4447-aff6-96c8adf1d541\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 07:54:21.698675 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.698659 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pl9vl\" (UniqueName: \"kubernetes.io/projected/5e53bbd8-7519-4447-aff6-96c8adf1d541-kube-api-access-pl9vl\") pod \"prometheus-k8s-0\" (UID: \"5e53bbd8-7519-4447-aff6-96c8adf1d541\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 07:54:21.742893 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.742866 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 07:54:21.868321 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:21.868260 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 20 07:54:21.871474 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:54:21.871434 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e53bbd8_7519_4447_aff6_96c8adf1d541.slice/crio-81b5ea94c1947d3f0b757cbdca8dd4083040af56ff6154d7fe39360e26eb5d5b WatchSource:0}: Error finding container 81b5ea94c1947d3f0b757cbdca8dd4083040af56ff6154d7fe39360e26eb5d5b: Status 404 returned error can't find the container with id 81b5ea94c1947d3f0b757cbdca8dd4083040af56ff6154d7fe39360e26eb5d5b Apr 20 07:54:22.084263 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:22.084231 2575 generic.go:358] "Generic (PLEG): container finished" podID="5e53bbd8-7519-4447-aff6-96c8adf1d541" containerID="1f99a04d4d80db56bda7f1b855448deb3eaf6786f4744949a40a437b239910bc" exitCode=0 Apr 20 07:54:22.084638 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:22.084295 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"5e53bbd8-7519-4447-aff6-96c8adf1d541","Type":"ContainerDied","Data":"1f99a04d4d80db56bda7f1b855448deb3eaf6786f4744949a40a437b239910bc"} Apr 20 07:54:22.084638 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:22.084316 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"5e53bbd8-7519-4447-aff6-96c8adf1d541","Type":"ContainerStarted","Data":"81b5ea94c1947d3f0b757cbdca8dd4083040af56ff6154d7fe39360e26eb5d5b"} Apr 20 07:54:23.090474 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:23.090397 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"5e53bbd8-7519-4447-aff6-96c8adf1d541","Type":"ContainerStarted","Data":"62cdc47798a06baba9b165f5aea323989a1101f2a1662f03cf160ddd4316046b"} Apr 20 07:54:23.090474 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:23.090436 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"5e53bbd8-7519-4447-aff6-96c8adf1d541","Type":"ContainerStarted","Data":"0f6d06ddf7e52a8055bce3327e7edde26c45d7909e889aedeeb699ab53990198"} Apr 20 07:54:23.090474 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:23.090449 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"5e53bbd8-7519-4447-aff6-96c8adf1d541","Type":"ContainerStarted","Data":"597f00765acdf05354e0f979024e37c0e5d489ad074e85107b69d5ddaef674f6"} Apr 20 07:54:23.090474 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:23.090458 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"5e53bbd8-7519-4447-aff6-96c8adf1d541","Type":"ContainerStarted","Data":"37e03eca4ea9ba13f72eff690504a921684fe6c505d5e68ee37926e294dfab97"} Apr 20 07:54:23.090474 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:23.090468 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"5e53bbd8-7519-4447-aff6-96c8adf1d541","Type":"ContainerStarted","Data":"dac2086805df330ac8fea662c7da95cf61daea84c4f2cf559d295ce80fc75c30"} Apr 20 07:54:23.090474 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:23.090477 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"5e53bbd8-7519-4447-aff6-96c8adf1d541","Type":"ContainerStarted","Data":"fb64ccd59fa334f55bb82dcf896fa37d23209a6d92f93d186b562479c7bbc83e"} Apr 20 07:54:23.122469 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:23.122407 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.122391035 podStartE2EDuration="2.122391035s" podCreationTimestamp="2026-04-20 07:54:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 07:54:23.120256784 +0000 UTC m=+260.293584214" watchObservedRunningTime="2026-04-20 07:54:23.122391035 +0000 UTC m=+260.295718465" Apr 20 07:54:23.289797 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:23.289768 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39" path="/var/lib/kubelet/pods/3ed85a0f-27c2-45c5-8dcd-c50bce2f5d39/volumes" Apr 20 07:54:26.743479 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:54:26.743441 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 07:55:03.241082 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:55:03.241052 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-crn8d_f3a24a42-45b7-4726-bc7c-32d9f9d61eaf/ovn-acl-logging/0.log" Apr 20 07:55:03.241700 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:55:03.241119 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-crn8d_f3a24a42-45b7-4726-bc7c-32d9f9d61eaf/ovn-acl-logging/0.log" Apr 20 07:55:03.245336 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:55:03.245313 2575 kubelet.go:1628] "Image garbage collection succeeded" Apr 20 07:55:21.743342 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:55:21.743302 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 07:55:21.758715 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:55:21.758691 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 07:55:22.275878 ip-10-0-129-24 kubenswrapper[2575]: 
I0420 07:55:22.275850 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 07:57:34.134804 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:57:34.134728 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-ncdws"] Apr 20 07:57:34.138018 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:57:34.138000 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-ncdws" Apr 20 07:57:34.140467 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:57:34.140440 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"openshift-lws-operator-dockercfg-sqgzf\"" Apr 20 07:57:34.140573 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:57:34.140454 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 20 07:57:34.141543 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:57:34.141518 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 20 07:57:34.144814 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:57:34.144739 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-ncdws"] Apr 20 07:57:34.166767 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:57:34.166731 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/936f0d6f-8aff-449d-a239-9d9ef533103f-tmp\") pod \"openshift-lws-operator-bfc7f696d-ncdws\" (UID: \"936f0d6f-8aff-449d-a239-9d9ef533103f\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-ncdws" Apr 20 07:57:34.166926 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:57:34.166783 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhzkc\" (UniqueName: \"kubernetes.io/projected/936f0d6f-8aff-449d-a239-9d9ef533103f-kube-api-access-jhzkc\") pod \"openshift-lws-operator-bfc7f696d-ncdws\" (UID: \"936f0d6f-8aff-449d-a239-9d9ef533103f\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-ncdws" Apr 20 07:57:34.267782 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:57:34.267748 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/936f0d6f-8aff-449d-a239-9d9ef533103f-tmp\") pod \"openshift-lws-operator-bfc7f696d-ncdws\" (UID: \"936f0d6f-8aff-449d-a239-9d9ef533103f\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-ncdws" Apr 20 07:57:34.267950 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:57:34.267802 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jhzkc\" (UniqueName: \"kubernetes.io/projected/936f0d6f-8aff-449d-a239-9d9ef533103f-kube-api-access-jhzkc\") pod \"openshift-lws-operator-bfc7f696d-ncdws\" (UID: \"936f0d6f-8aff-449d-a239-9d9ef533103f\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-ncdws" Apr 20 07:57:34.268144 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:57:34.268114 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/936f0d6f-8aff-449d-a239-9d9ef533103f-tmp\") pod \"openshift-lws-operator-bfc7f696d-ncdws\" (UID: \"936f0d6f-8aff-449d-a239-9d9ef533103f\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-ncdws" Apr 20 07:57:34.275438 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:57:34.275412 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhzkc\" (UniqueName: \"kubernetes.io/projected/936f0d6f-8aff-449d-a239-9d9ef533103f-kube-api-access-jhzkc\") pod \"openshift-lws-operator-bfc7f696d-ncdws\" (UID: 
\"936f0d6f-8aff-449d-a239-9d9ef533103f\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-ncdws" Apr 20 07:57:34.456243 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:57:34.456145 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-ncdws" Apr 20 07:57:34.574857 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:57:34.574825 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-ncdws"] Apr 20 07:57:34.578038 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:57:34.578009 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod936f0d6f_8aff_449d_a239_9d9ef533103f.slice/crio-a89f43e2ae33d81431b4b4b79d0fcc829dffdb09128db6d7099f06e022602bbd WatchSource:0}: Error finding container a89f43e2ae33d81431b4b4b79d0fcc829dffdb09128db6d7099f06e022602bbd: Status 404 returned error can't find the container with id a89f43e2ae33d81431b4b4b79d0fcc829dffdb09128db6d7099f06e022602bbd Apr 20 07:57:34.579369 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:57:34.579351 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 07:57:34.627674 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:57:34.627640 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-ncdws" event={"ID":"936f0d6f-8aff-449d-a239-9d9ef533103f","Type":"ContainerStarted","Data":"a89f43e2ae33d81431b4b4b79d0fcc829dffdb09128db6d7099f06e022602bbd"} Apr 20 07:57:37.638547 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:57:37.638515 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-ncdws" event={"ID":"936f0d6f-8aff-449d-a239-9d9ef533103f","Type":"ContainerStarted","Data":"7652407ce2091ac001df83e180aef8b87f147eec3caa5ce04259cace760c7823"} Apr 
20 07:57:37.654622 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:57:37.654572 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-ncdws" podStartSLOduration=1.20064447 podStartE2EDuration="3.654557041s" podCreationTimestamp="2026-04-20 07:57:34 +0000 UTC" firstStartedPulling="2026-04-20 07:57:34.579505835 +0000 UTC m=+451.752833242" lastFinishedPulling="2026-04-20 07:57:37.033418403 +0000 UTC m=+454.206745813" observedRunningTime="2026-04-20 07:57:37.653046911 +0000 UTC m=+454.826374340" watchObservedRunningTime="2026-04-20 07:57:37.654557041 +0000 UTC m=+454.827884470" Apr 20 07:57:53.536608 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:57:53.536571 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-687c889b9-cxk5c"] Apr 20 07:57:53.539798 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:57:53.539777 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-687c889b9-cxk5c" Apr 20 07:57:53.542967 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:57:53.542936 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\"" Apr 20 07:57:53.542967 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:57:53.542959 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-9x2zw\"" Apr 20 07:57:53.543493 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:57:53.543475 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\"" Apr 20 07:57:53.543541 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:57:53.543508 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\"" Apr 20 07:57:53.546075 
ip-10-0-129-24 kubenswrapper[2575]: I0420 07:57:53.546054 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\"" Apr 20 07:57:53.559068 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:57:53.559040 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-687c889b9-cxk5c"] Apr 20 07:57:53.615921 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:57:53.615887 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/98cc77c3-3ed4-49be-914d-ae2b1d05c5dd-webhook-cert\") pod \"opendatahub-operator-controller-manager-687c889b9-cxk5c\" (UID: \"98cc77c3-3ed4-49be-914d-ae2b1d05c5dd\") " pod="opendatahub/opendatahub-operator-controller-manager-687c889b9-cxk5c" Apr 20 07:57:53.615921 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:57:53.615932 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhd2k\" (UniqueName: \"kubernetes.io/projected/98cc77c3-3ed4-49be-914d-ae2b1d05c5dd-kube-api-access-qhd2k\") pod \"opendatahub-operator-controller-manager-687c889b9-cxk5c\" (UID: \"98cc77c3-3ed4-49be-914d-ae2b1d05c5dd\") " pod="opendatahub/opendatahub-operator-controller-manager-687c889b9-cxk5c" Apr 20 07:57:53.616151 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:57:53.615955 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/98cc77c3-3ed4-49be-914d-ae2b1d05c5dd-apiservice-cert\") pod \"opendatahub-operator-controller-manager-687c889b9-cxk5c\" (UID: \"98cc77c3-3ed4-49be-914d-ae2b1d05c5dd\") " pod="opendatahub/opendatahub-operator-controller-manager-687c889b9-cxk5c" Apr 20 07:57:53.717349 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:57:53.717313 2575 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/98cc77c3-3ed4-49be-914d-ae2b1d05c5dd-webhook-cert\") pod \"opendatahub-operator-controller-manager-687c889b9-cxk5c\" (UID: \"98cc77c3-3ed4-49be-914d-ae2b1d05c5dd\") " pod="opendatahub/opendatahub-operator-controller-manager-687c889b9-cxk5c" Apr 20 07:57:53.717560 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:57:53.717362 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qhd2k\" (UniqueName: \"kubernetes.io/projected/98cc77c3-3ed4-49be-914d-ae2b1d05c5dd-kube-api-access-qhd2k\") pod \"opendatahub-operator-controller-manager-687c889b9-cxk5c\" (UID: \"98cc77c3-3ed4-49be-914d-ae2b1d05c5dd\") " pod="opendatahub/opendatahub-operator-controller-manager-687c889b9-cxk5c" Apr 20 07:57:53.717560 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:57:53.717425 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/98cc77c3-3ed4-49be-914d-ae2b1d05c5dd-apiservice-cert\") pod \"opendatahub-operator-controller-manager-687c889b9-cxk5c\" (UID: \"98cc77c3-3ed4-49be-914d-ae2b1d05c5dd\") " pod="opendatahub/opendatahub-operator-controller-manager-687c889b9-cxk5c" Apr 20 07:57:53.719945 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:57:53.719916 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/98cc77c3-3ed4-49be-914d-ae2b1d05c5dd-apiservice-cert\") pod \"opendatahub-operator-controller-manager-687c889b9-cxk5c\" (UID: \"98cc77c3-3ed4-49be-914d-ae2b1d05c5dd\") " pod="opendatahub/opendatahub-operator-controller-manager-687c889b9-cxk5c" Apr 20 07:57:53.720056 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:57:53.719947 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/98cc77c3-3ed4-49be-914d-ae2b1d05c5dd-webhook-cert\") 
pod \"opendatahub-operator-controller-manager-687c889b9-cxk5c\" (UID: \"98cc77c3-3ed4-49be-914d-ae2b1d05c5dd\") " pod="opendatahub/opendatahub-operator-controller-manager-687c889b9-cxk5c" Apr 20 07:57:53.741115 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:57:53.741084 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhd2k\" (UniqueName: \"kubernetes.io/projected/98cc77c3-3ed4-49be-914d-ae2b1d05c5dd-kube-api-access-qhd2k\") pod \"opendatahub-operator-controller-manager-687c889b9-cxk5c\" (UID: \"98cc77c3-3ed4-49be-914d-ae2b1d05c5dd\") " pod="opendatahub/opendatahub-operator-controller-manager-687c889b9-cxk5c" Apr 20 07:57:53.849834 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:57:53.849750 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-687c889b9-cxk5c" Apr 20 07:57:53.978043 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:57:53.978021 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-687c889b9-cxk5c"] Apr 20 07:57:53.980596 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:57:53.980570 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod98cc77c3_3ed4_49be_914d_ae2b1d05c5dd.slice/crio-c6502d2e5c7ec30dfa46726828f7b05d5140370804aeee27f2b30fe5d0f487e1 WatchSource:0}: Error finding container c6502d2e5c7ec30dfa46726828f7b05d5140370804aeee27f2b30fe5d0f487e1: Status 404 returned error can't find the container with id c6502d2e5c7ec30dfa46726828f7b05d5140370804aeee27f2b30fe5d0f487e1 Apr 20 07:57:54.686273 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:57:54.686226 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-687c889b9-cxk5c" 
event={"ID":"98cc77c3-3ed4-49be-914d-ae2b1d05c5dd","Type":"ContainerStarted","Data":"c6502d2e5c7ec30dfa46726828f7b05d5140370804aeee27f2b30fe5d0f487e1"} Apr 20 07:57:56.694440 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:57:56.694402 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-687c889b9-cxk5c" event={"ID":"98cc77c3-3ed4-49be-914d-ae2b1d05c5dd","Type":"ContainerStarted","Data":"9d713595cdea3db17253819b5dee7ba1c8cf59bf30a452b9b6840380bf089cd5"} Apr 20 07:57:56.694886 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:57:56.694552 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-687c889b9-cxk5c" Apr 20 07:57:56.719390 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:57:56.719335 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-687c889b9-cxk5c" podStartSLOduration=1.350786852 podStartE2EDuration="3.719319015s" podCreationTimestamp="2026-04-20 07:57:53 +0000 UTC" firstStartedPulling="2026-04-20 07:57:53.982374881 +0000 UTC m=+471.155702288" lastFinishedPulling="2026-04-20 07:57:56.350907039 +0000 UTC m=+473.524234451" observedRunningTime="2026-04-20 07:57:56.716475962 +0000 UTC m=+473.889803391" watchObservedRunningTime="2026-04-20 07:57:56.719319015 +0000 UTC m=+473.892646445" Apr 20 07:58:07.700312 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:58:07.700283 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-687c889b9-cxk5c" Apr 20 07:58:08.831893 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:58:08.831858 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-56d8f7c9b7-5wj6m"] Apr 20 07:58:08.836153 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:58:08.836136 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-56d8f7c9b7-5wj6m" Apr 20 07:58:08.839863 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:58:08.839839 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\"" Apr 20 07:58:08.840001 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:58:08.839843 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\"" Apr 20 07:58:08.840001 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:58:08.839847 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\"" Apr 20 07:58:08.840001 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:58:08.839847 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-rd9xb\"" Apr 20 07:58:08.844008 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:58:08.843657 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-56d8f7c9b7-5wj6m"] Apr 20 07:58:08.949644 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:58:08.949580 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mzkcb\" (UniqueName: \"kubernetes.io/projected/e69badf5-d10f-40d6-8fab-6e2b81746b60-kube-api-access-mzkcb\") pod \"lws-controller-manager-56d8f7c9b7-5wj6m\" (UID: \"e69badf5-d10f-40d6-8fab-6e2b81746b60\") " pod="openshift-lws-operator/lws-controller-manager-56d8f7c9b7-5wj6m" Apr 20 07:58:08.949825 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:58:08.949658 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e69badf5-d10f-40d6-8fab-6e2b81746b60-cert\") pod \"lws-controller-manager-56d8f7c9b7-5wj6m\" (UID: 
\"e69badf5-d10f-40d6-8fab-6e2b81746b60\") " pod="openshift-lws-operator/lws-controller-manager-56d8f7c9b7-5wj6m" Apr 20 07:58:08.949825 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:58:08.949679 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/e69badf5-d10f-40d6-8fab-6e2b81746b60-metrics-cert\") pod \"lws-controller-manager-56d8f7c9b7-5wj6m\" (UID: \"e69badf5-d10f-40d6-8fab-6e2b81746b60\") " pod="openshift-lws-operator/lws-controller-manager-56d8f7c9b7-5wj6m" Apr 20 07:58:08.949825 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:58:08.949745 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/e69badf5-d10f-40d6-8fab-6e2b81746b60-manager-config\") pod \"lws-controller-manager-56d8f7c9b7-5wj6m\" (UID: \"e69badf5-d10f-40d6-8fab-6e2b81746b60\") " pod="openshift-lws-operator/lws-controller-manager-56d8f7c9b7-5wj6m" Apr 20 07:58:09.050637 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:58:09.050578 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/e69badf5-d10f-40d6-8fab-6e2b81746b60-manager-config\") pod \"lws-controller-manager-56d8f7c9b7-5wj6m\" (UID: \"e69badf5-d10f-40d6-8fab-6e2b81746b60\") " pod="openshift-lws-operator/lws-controller-manager-56d8f7c9b7-5wj6m" Apr 20 07:58:09.050808 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:58:09.050655 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mzkcb\" (UniqueName: \"kubernetes.io/projected/e69badf5-d10f-40d6-8fab-6e2b81746b60-kube-api-access-mzkcb\") pod \"lws-controller-manager-56d8f7c9b7-5wj6m\" (UID: \"e69badf5-d10f-40d6-8fab-6e2b81746b60\") " pod="openshift-lws-operator/lws-controller-manager-56d8f7c9b7-5wj6m" Apr 20 07:58:09.050808 ip-10-0-129-24 kubenswrapper[2575]: I0420 
07:58:09.050684 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e69badf5-d10f-40d6-8fab-6e2b81746b60-cert\") pod \"lws-controller-manager-56d8f7c9b7-5wj6m\" (UID: \"e69badf5-d10f-40d6-8fab-6e2b81746b60\") " pod="openshift-lws-operator/lws-controller-manager-56d8f7c9b7-5wj6m" Apr 20 07:58:09.050808 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:58:09.050699 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/e69badf5-d10f-40d6-8fab-6e2b81746b60-metrics-cert\") pod \"lws-controller-manager-56d8f7c9b7-5wj6m\" (UID: \"e69badf5-d10f-40d6-8fab-6e2b81746b60\") " pod="openshift-lws-operator/lws-controller-manager-56d8f7c9b7-5wj6m" Apr 20 07:58:09.051193 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:58:09.051174 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/e69badf5-d10f-40d6-8fab-6e2b81746b60-manager-config\") pod \"lws-controller-manager-56d8f7c9b7-5wj6m\" (UID: \"e69badf5-d10f-40d6-8fab-6e2b81746b60\") " pod="openshift-lws-operator/lws-controller-manager-56d8f7c9b7-5wj6m" Apr 20 07:58:09.053139 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:58:09.053112 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/e69badf5-d10f-40d6-8fab-6e2b81746b60-metrics-cert\") pod \"lws-controller-manager-56d8f7c9b7-5wj6m\" (UID: \"e69badf5-d10f-40d6-8fab-6e2b81746b60\") " pod="openshift-lws-operator/lws-controller-manager-56d8f7c9b7-5wj6m" Apr 20 07:58:09.053237 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:58:09.053151 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e69badf5-d10f-40d6-8fab-6e2b81746b60-cert\") pod \"lws-controller-manager-56d8f7c9b7-5wj6m\" (UID: \"e69badf5-d10f-40d6-8fab-6e2b81746b60\") " 
pod="openshift-lws-operator/lws-controller-manager-56d8f7c9b7-5wj6m" Apr 20 07:58:09.070338 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:58:09.070313 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mzkcb\" (UniqueName: \"kubernetes.io/projected/e69badf5-d10f-40d6-8fab-6e2b81746b60-kube-api-access-mzkcb\") pod \"lws-controller-manager-56d8f7c9b7-5wj6m\" (UID: \"e69badf5-d10f-40d6-8fab-6e2b81746b60\") " pod="openshift-lws-operator/lws-controller-manager-56d8f7c9b7-5wj6m" Apr 20 07:58:09.146131 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:58:09.146097 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-56d8f7c9b7-5wj6m" Apr 20 07:58:09.270554 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:58:09.270532 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-56d8f7c9b7-5wj6m"] Apr 20 07:58:09.273116 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:58:09.273089 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode69badf5_d10f_40d6_8fab_6e2b81746b60.slice/crio-1dfd4aa9ad7a6e6ff3c577f3bb423b60afcf247e724504e863a1dce5bd704b77 WatchSource:0}: Error finding container 1dfd4aa9ad7a6e6ff3c577f3bb423b60afcf247e724504e863a1dce5bd704b77: Status 404 returned error can't find the container with id 1dfd4aa9ad7a6e6ff3c577f3bb423b60afcf247e724504e863a1dce5bd704b77 Apr 20 07:58:09.738458 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:58:09.738424 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-56d8f7c9b7-5wj6m" event={"ID":"e69badf5-d10f-40d6-8fab-6e2b81746b60","Type":"ContainerStarted","Data":"1dfd4aa9ad7a6e6ff3c577f3bb423b60afcf247e724504e863a1dce5bd704b77"} Apr 20 07:58:11.443302 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:58:11.443253 2575 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-ingress/kube-auth-proxy-7b9784c649-q6g22"] Apr 20 07:58:11.447254 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:58:11.447217 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/kube-auth-proxy-7b9784c649-q6g22" Apr 20 07:58:11.451317 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:58:11.451131 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 20 07:58:11.451317 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:58:11.451167 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-tls\"" Apr 20 07:58:11.451317 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:58:11.451175 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-creds\"" Apr 20 07:58:11.451317 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:58:11.451225 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 20 07:58:11.451317 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:58:11.451133 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-dockercfg-k5w9z\"" Apr 20 07:58:11.453964 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:58:11.453898 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-7b9784c649-q6g22"] Apr 20 07:58:11.574040 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:58:11.574002 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/eb121239-9981-44fc-80ad-5c6f5b00b76c-tmp\") pod \"kube-auth-proxy-7b9784c649-q6g22\" (UID: \"eb121239-9981-44fc-80ad-5c6f5b00b76c\") " pod="openshift-ingress/kube-auth-proxy-7b9784c649-q6g22" Apr 20 07:58:11.574229 ip-10-0-129-24 kubenswrapper[2575]: 
I0420 07:58:11.574061 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/eb121239-9981-44fc-80ad-5c6f5b00b76c-tls-certs\") pod \"kube-auth-proxy-7b9784c649-q6g22\" (UID: \"eb121239-9981-44fc-80ad-5c6f5b00b76c\") " pod="openshift-ingress/kube-auth-proxy-7b9784c649-q6g22" Apr 20 07:58:11.574229 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:58:11.574124 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-258qm\" (UniqueName: \"kubernetes.io/projected/eb121239-9981-44fc-80ad-5c6f5b00b76c-kube-api-access-258qm\") pod \"kube-auth-proxy-7b9784c649-q6g22\" (UID: \"eb121239-9981-44fc-80ad-5c6f5b00b76c\") " pod="openshift-ingress/kube-auth-proxy-7b9784c649-q6g22" Apr 20 07:58:11.674732 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:58:11.674688 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-258qm\" (UniqueName: \"kubernetes.io/projected/eb121239-9981-44fc-80ad-5c6f5b00b76c-kube-api-access-258qm\") pod \"kube-auth-proxy-7b9784c649-q6g22\" (UID: \"eb121239-9981-44fc-80ad-5c6f5b00b76c\") " pod="openshift-ingress/kube-auth-proxy-7b9784c649-q6g22" Apr 20 07:58:11.674931 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:58:11.674775 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/eb121239-9981-44fc-80ad-5c6f5b00b76c-tmp\") pod \"kube-auth-proxy-7b9784c649-q6g22\" (UID: \"eb121239-9981-44fc-80ad-5c6f5b00b76c\") " pod="openshift-ingress/kube-auth-proxy-7b9784c649-q6g22" Apr 20 07:58:11.674931 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:58:11.674825 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/eb121239-9981-44fc-80ad-5c6f5b00b76c-tls-certs\") pod \"kube-auth-proxy-7b9784c649-q6g22\" (UID: 
\"eb121239-9981-44fc-80ad-5c6f5b00b76c\") " pod="openshift-ingress/kube-auth-proxy-7b9784c649-q6g22" Apr 20 07:58:11.677037 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:58:11.677008 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/eb121239-9981-44fc-80ad-5c6f5b00b76c-tmp\") pod \"kube-auth-proxy-7b9784c649-q6g22\" (UID: \"eb121239-9981-44fc-80ad-5c6f5b00b76c\") " pod="openshift-ingress/kube-auth-proxy-7b9784c649-q6g22" Apr 20 07:58:11.677341 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:58:11.677319 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/eb121239-9981-44fc-80ad-5c6f5b00b76c-tls-certs\") pod \"kube-auth-proxy-7b9784c649-q6g22\" (UID: \"eb121239-9981-44fc-80ad-5c6f5b00b76c\") " pod="openshift-ingress/kube-auth-proxy-7b9784c649-q6g22" Apr 20 07:58:11.681988 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:58:11.681970 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-258qm\" (UniqueName: \"kubernetes.io/projected/eb121239-9981-44fc-80ad-5c6f5b00b76c-kube-api-access-258qm\") pod \"kube-auth-proxy-7b9784c649-q6g22\" (UID: \"eb121239-9981-44fc-80ad-5c6f5b00b76c\") " pod="openshift-ingress/kube-auth-proxy-7b9784c649-q6g22" Apr 20 07:58:11.747508 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:58:11.747422 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-56d8f7c9b7-5wj6m" event={"ID":"e69badf5-d10f-40d6-8fab-6e2b81746b60","Type":"ContainerStarted","Data":"5d637294088ec28e42d05a893d1e1264b6f0f6d5c85ff7b47447fd8bd14ee7cd"} Apr 20 07:58:11.747689 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:58:11.747531 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-56d8f7c9b7-5wj6m" Apr 20 07:58:11.759308 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:58:11.759284 2575 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/kube-auth-proxy-7b9784c649-q6g22" Apr 20 07:58:11.764950 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:58:11.764908 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/lws-controller-manager-56d8f7c9b7-5wj6m" podStartSLOduration=1.973263793 podStartE2EDuration="3.76489481s" podCreationTimestamp="2026-04-20 07:58:08 +0000 UTC" firstStartedPulling="2026-04-20 07:58:09.274948388 +0000 UTC m=+486.448275796" lastFinishedPulling="2026-04-20 07:58:11.066579403 +0000 UTC m=+488.239906813" observedRunningTime="2026-04-20 07:58:11.762744083 +0000 UTC m=+488.936071525" watchObservedRunningTime="2026-04-20 07:58:11.76489481 +0000 UTC m=+488.938222239" Apr 20 07:58:11.882541 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:58:11.882516 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-7b9784c649-q6g22"] Apr 20 07:58:11.884570 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:58:11.884543 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeb121239_9981_44fc_80ad_5c6f5b00b76c.slice/crio-75548b888b280465aa4a306e989ebf7368fd605a2f2ac2030c0517b2505b2e57 WatchSource:0}: Error finding container 75548b888b280465aa4a306e989ebf7368fd605a2f2ac2030c0517b2505b2e57: Status 404 returned error can't find the container with id 75548b888b280465aa4a306e989ebf7368fd605a2f2ac2030c0517b2505b2e57 Apr 20 07:58:12.754027 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:58:12.753959 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-7b9784c649-q6g22" event={"ID":"eb121239-9981-44fc-80ad-5c6f5b00b76c","Type":"ContainerStarted","Data":"75548b888b280465aa4a306e989ebf7368fd605a2f2ac2030c0517b2505b2e57"} Apr 20 07:58:15.765902 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:58:15.765868 2575 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-7b9784c649-q6g22" event={"ID":"eb121239-9981-44fc-80ad-5c6f5b00b76c","Type":"ContainerStarted","Data":"0b5e3fd8a9c7d0a6a492371da3665d2e6256b7339fc8ee7c692363d1336a6362"} Apr 20 07:58:15.782419 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:58:15.782373 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/kube-auth-proxy-7b9784c649-q6g22" podStartSLOduration=1.648744335 podStartE2EDuration="4.782358924s" podCreationTimestamp="2026-04-20 07:58:11 +0000 UTC" firstStartedPulling="2026-04-20 07:58:11.8863582 +0000 UTC m=+489.059685606" lastFinishedPulling="2026-04-20 07:58:15.019972788 +0000 UTC m=+492.193300195" observedRunningTime="2026-04-20 07:58:15.780818864 +0000 UTC m=+492.954146301" watchObservedRunningTime="2026-04-20 07:58:15.782358924 +0000 UTC m=+492.955686400" Apr 20 07:58:22.756552 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:58:22.756518 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-56d8f7c9b7-5wj6m" Apr 20 07:59:51.666039 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:59:51.665998 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-9twjp"] Apr 20 07:59:51.669366 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:59:51.669350 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-9twjp" Apr 20 07:59:51.672044 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:59:51.672013 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 20 07:59:51.672179 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:59:51.672163 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 20 07:59:51.673207 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:59:51.673192 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"limitador-operator-controller-manager-dockercfg-sdshz\"" Apr 20 07:59:51.677100 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:59:51.677081 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-9twjp"] Apr 20 07:59:51.694415 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:59:51.694389 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dnvb4\" (UniqueName: \"kubernetes.io/projected/a71d4fa7-e04a-491d-9c0f-a91f444e9d83-kube-api-access-dnvb4\") pod \"limitador-operator-controller-manager-85c4996f8c-9twjp\" (UID: \"a71d4fa7-e04a-491d-9c0f-a91f444e9d83\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-9twjp" Apr 20 07:59:51.795252 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:59:51.795219 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dnvb4\" (UniqueName: \"kubernetes.io/projected/a71d4fa7-e04a-491d-9c0f-a91f444e9d83-kube-api-access-dnvb4\") pod \"limitador-operator-controller-manager-85c4996f8c-9twjp\" (UID: \"a71d4fa7-e04a-491d-9c0f-a91f444e9d83\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-9twjp" Apr 20 07:59:51.805685 ip-10-0-129-24 
kubenswrapper[2575]: I0420 07:59:51.805659 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dnvb4\" (UniqueName: \"kubernetes.io/projected/a71d4fa7-e04a-491d-9c0f-a91f444e9d83-kube-api-access-dnvb4\") pod \"limitador-operator-controller-manager-85c4996f8c-9twjp\" (UID: \"a71d4fa7-e04a-491d-9c0f-a91f444e9d83\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-9twjp" Apr 20 07:59:51.979934 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:59:51.979828 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-9twjp" Apr 20 07:59:52.106355 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:59:52.106312 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-9twjp"] Apr 20 07:59:52.109574 ip-10-0-129-24 kubenswrapper[2575]: W0420 07:59:52.109542 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda71d4fa7_e04a_491d_9c0f_a91f444e9d83.slice/crio-4db5f567068ab1be061fa5016b4f46cbda97e3afa786ae94f6f21bf635447b89 WatchSource:0}: Error finding container 4db5f567068ab1be061fa5016b4f46cbda97e3afa786ae94f6f21bf635447b89: Status 404 returned error can't find the container with id 4db5f567068ab1be061fa5016b4f46cbda97e3afa786ae94f6f21bf635447b89 Apr 20 07:59:53.080002 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:59:53.079959 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-9twjp" event={"ID":"a71d4fa7-e04a-491d-9c0f-a91f444e9d83","Type":"ContainerStarted","Data":"4db5f567068ab1be061fa5016b4f46cbda97e3afa786ae94f6f21bf635447b89"} Apr 20 07:59:54.084847 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:59:54.084799 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-9twjp" event={"ID":"a71d4fa7-e04a-491d-9c0f-a91f444e9d83","Type":"ContainerStarted","Data":"cad05883f9991a2600b931167f103b3f1637fcb05ae7dd68cd955e03246fdd3d"} Apr 20 07:59:54.085214 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:59:54.084939 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-9twjp" Apr 20 07:59:54.100951 ip-10-0-129-24 kubenswrapper[2575]: I0420 07:59:54.100903 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-9twjp" podStartSLOduration=1.253652152 podStartE2EDuration="3.100889622s" podCreationTimestamp="2026-04-20 07:59:51 +0000 UTC" firstStartedPulling="2026-04-20 07:59:52.111463567 +0000 UTC m=+589.284790974" lastFinishedPulling="2026-04-20 07:59:53.958701037 +0000 UTC m=+591.132028444" observedRunningTime="2026-04-20 07:59:54.099335535 +0000 UTC m=+591.272662975" watchObservedRunningTime="2026-04-20 07:59:54.100889622 +0000 UTC m=+591.274217050" Apr 20 08:00:02.355734 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:00:02.355697 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-9twjp"] Apr 20 08:00:02.356234 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:00:02.355962 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-9twjp" podUID="a71d4fa7-e04a-491d-9c0f-a91f444e9d83" containerName="manager" containerID="cri-o://cad05883f9991a2600b931167f103b3f1637fcb05ae7dd68cd955e03246fdd3d" gracePeriod=2 Apr 20 08:00:02.357533 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:00:02.357508 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-9twjp" Apr 20 
08:00:02.362804 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:00:02.362780 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-9twjp"] Apr 20 08:00:02.365765 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:00:02.365722 2575 status_manager.go:895] "Failed to get status for pod" podUID="a71d4fa7-e04a-491d-9c0f-a91f444e9d83" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-9twjp" err="pods \"limitador-operator-controller-manager-85c4996f8c-9twjp\" is forbidden: User \"system:node:ip-10-0-129-24.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-129-24.ec2.internal' and this object" Apr 20 08:00:02.376476 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:00:02.376452 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-sz9jn"] Apr 20 08:00:02.376811 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:00:02.376795 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a71d4fa7-e04a-491d-9c0f-a91f444e9d83" containerName="manager" Apr 20 08:00:02.376811 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:00:02.376812 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="a71d4fa7-e04a-491d-9c0f-a91f444e9d83" containerName="manager" Apr 20 08:00:02.376969 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:00:02.376886 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="a71d4fa7-e04a-491d-9c0f-a91f444e9d83" containerName="manager" Apr 20 08:00:02.379847 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:00:02.379830 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-sz9jn" Apr 20 08:00:02.382039 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:00:02.382012 2575 status_manager.go:895] "Failed to get status for pod" podUID="a71d4fa7-e04a-491d-9c0f-a91f444e9d83" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-9twjp" err="pods \"limitador-operator-controller-manager-85c4996f8c-9twjp\" is forbidden: User \"system:node:ip-10-0-129-24.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-129-24.ec2.internal' and this object" Apr 20 08:00:02.390283 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:00:02.390238 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-sz9jn"] Apr 20 08:00:02.483284 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:00:02.483252 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9tq2\" (UniqueName: \"kubernetes.io/projected/c753973c-52b8-4bc4-8525-997c6a67bde6-kube-api-access-k9tq2\") pod \"limitador-operator-controller-manager-85c4996f8c-sz9jn\" (UID: \"c753973c-52b8-4bc4-8525-997c6a67bde6\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-sz9jn" Apr 20 08:00:02.583915 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:00:02.583881 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k9tq2\" (UniqueName: \"kubernetes.io/projected/c753973c-52b8-4bc4-8525-997c6a67bde6-kube-api-access-k9tq2\") pod \"limitador-operator-controller-manager-85c4996f8c-sz9jn\" (UID: \"c753973c-52b8-4bc4-8525-997c6a67bde6\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-sz9jn" Apr 20 08:00:02.585088 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:00:02.585071 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-9twjp" Apr 20 08:00:02.587289 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:00:02.587254 2575 status_manager.go:895] "Failed to get status for pod" podUID="a71d4fa7-e04a-491d-9c0f-a91f444e9d83" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-9twjp" err="pods \"limitador-operator-controller-manager-85c4996f8c-9twjp\" is forbidden: User \"system:node:ip-10-0-129-24.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-129-24.ec2.internal' and this object" Apr 20 08:00:02.591664 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:00:02.591642 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9tq2\" (UniqueName: \"kubernetes.io/projected/c753973c-52b8-4bc4-8525-997c6a67bde6-kube-api-access-k9tq2\") pod \"limitador-operator-controller-manager-85c4996f8c-sz9jn\" (UID: \"c753973c-52b8-4bc4-8525-997c6a67bde6\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-sz9jn" Apr 20 08:00:02.684871 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:00:02.684827 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dnvb4\" (UniqueName: \"kubernetes.io/projected/a71d4fa7-e04a-491d-9c0f-a91f444e9d83-kube-api-access-dnvb4\") pod \"a71d4fa7-e04a-491d-9c0f-a91f444e9d83\" (UID: \"a71d4fa7-e04a-491d-9c0f-a91f444e9d83\") " Apr 20 08:00:02.686971 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:00:02.686939 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a71d4fa7-e04a-491d-9c0f-a91f444e9d83-kube-api-access-dnvb4" (OuterVolumeSpecName: "kube-api-access-dnvb4") pod "a71d4fa7-e04a-491d-9c0f-a91f444e9d83" (UID: "a71d4fa7-e04a-491d-9c0f-a91f444e9d83"). InnerVolumeSpecName "kube-api-access-dnvb4". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 08:00:02.749231 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:00:02.749191 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-sz9jn" Apr 20 08:00:02.786225 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:00:02.786183 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dnvb4\" (UniqueName: \"kubernetes.io/projected/a71d4fa7-e04a-491d-9c0f-a91f444e9d83-kube-api-access-dnvb4\") on node \"ip-10-0-129-24.ec2.internal\" DevicePath \"\"" Apr 20 08:00:02.874441 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:00:02.874415 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-sz9jn"] Apr 20 08:00:02.877082 ip-10-0-129-24 kubenswrapper[2575]: W0420 08:00:02.877056 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc753973c_52b8_4bc4_8525_997c6a67bde6.slice/crio-7d5bce25498b99d697bbe4277e80afc1628334fb783c36180a08b5596dcf8403 WatchSource:0}: Error finding container 7d5bce25498b99d697bbe4277e80afc1628334fb783c36180a08b5596dcf8403: Status 404 returned error can't find the container with id 7d5bce25498b99d697bbe4277e80afc1628334fb783c36180a08b5596dcf8403 Apr 20 08:00:03.116461 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:00:03.116424 2575 generic.go:358] "Generic (PLEG): container finished" podID="a71d4fa7-e04a-491d-9c0f-a91f444e9d83" containerID="cad05883f9991a2600b931167f103b3f1637fcb05ae7dd68cd955e03246fdd3d" exitCode=0 Apr 20 08:00:03.116667 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:00:03.116472 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-9twjp" Apr 20 08:00:03.116667 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:00:03.116520 2575 scope.go:117] "RemoveContainer" containerID="cad05883f9991a2600b931167f103b3f1637fcb05ae7dd68cd955e03246fdd3d" Apr 20 08:00:03.118292 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:00:03.118254 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-sz9jn" event={"ID":"c753973c-52b8-4bc4-8525-997c6a67bde6","Type":"ContainerStarted","Data":"95a268da018607714cbd7bdcc469e2799754ed0ec5237c2309d595d8fd333e7c"} Apr 20 08:00:03.118292 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:00:03.118291 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-sz9jn" event={"ID":"c753973c-52b8-4bc4-8525-997c6a67bde6","Type":"ContainerStarted","Data":"7d5bce25498b99d697bbe4277e80afc1628334fb783c36180a08b5596dcf8403"} Apr 20 08:00:03.118785 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:00:03.118377 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-sz9jn" Apr 20 08:00:03.118856 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:00:03.118836 2575 status_manager.go:895] "Failed to get status for pod" podUID="a71d4fa7-e04a-491d-9c0f-a91f444e9d83" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-9twjp" err="pods \"limitador-operator-controller-manager-85c4996f8c-9twjp\" is forbidden: User \"system:node:ip-10-0-129-24.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-129-24.ec2.internal' and this object" Apr 20 08:00:03.125132 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:00:03.125111 2575 scope.go:117] "RemoveContainer" 
containerID="cad05883f9991a2600b931167f103b3f1637fcb05ae7dd68cd955e03246fdd3d" Apr 20 08:00:03.125399 ip-10-0-129-24 kubenswrapper[2575]: E0420 08:00:03.125374 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cad05883f9991a2600b931167f103b3f1637fcb05ae7dd68cd955e03246fdd3d\": container with ID starting with cad05883f9991a2600b931167f103b3f1637fcb05ae7dd68cd955e03246fdd3d not found: ID does not exist" containerID="cad05883f9991a2600b931167f103b3f1637fcb05ae7dd68cd955e03246fdd3d" Apr 20 08:00:03.125462 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:00:03.125412 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cad05883f9991a2600b931167f103b3f1637fcb05ae7dd68cd955e03246fdd3d"} err="failed to get container status \"cad05883f9991a2600b931167f103b3f1637fcb05ae7dd68cd955e03246fdd3d\": rpc error: code = NotFound desc = could not find container \"cad05883f9991a2600b931167f103b3f1637fcb05ae7dd68cd955e03246fdd3d\": container with ID starting with cad05883f9991a2600b931167f103b3f1637fcb05ae7dd68cd955e03246fdd3d not found: ID does not exist" Apr 20 08:00:03.146056 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:00:03.146004 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-sz9jn" podStartSLOduration=1.145989185 podStartE2EDuration="1.145989185s" podCreationTimestamp="2026-04-20 08:00:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 08:00:03.145662066 +0000 UTC m=+600.318989496" watchObservedRunningTime="2026-04-20 08:00:03.145989185 +0000 UTC m=+600.319316697" Apr 20 08:00:03.147875 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:00:03.147824 2575 status_manager.go:895] "Failed to get status for pod" podUID="a71d4fa7-e04a-491d-9c0f-a91f444e9d83" 
pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-9twjp" err="pods \"limitador-operator-controller-manager-85c4996f8c-9twjp\" is forbidden: User \"system:node:ip-10-0-129-24.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-129-24.ec2.internal' and this object" Apr 20 08:00:03.150090 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:00:03.150062 2575 status_manager.go:895] "Failed to get status for pod" podUID="a71d4fa7-e04a-491d-9c0f-a91f444e9d83" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-9twjp" err="pods \"limitador-operator-controller-manager-85c4996f8c-9twjp\" is forbidden: User \"system:node:ip-10-0-129-24.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-129-24.ec2.internal' and this object" Apr 20 08:00:03.264319 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:00:03.264238 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-crn8d_f3a24a42-45b7-4726-bc7c-32d9f9d61eaf/ovn-acl-logging/0.log" Apr 20 08:00:03.265655 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:00:03.265604 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-crn8d_f3a24a42-45b7-4726-bc7c-32d9f9d61eaf/ovn-acl-logging/0.log" Apr 20 08:00:03.290587 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:00:03.290554 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a71d4fa7-e04a-491d-9c0f-a91f444e9d83" path="/var/lib/kubelet/pods/a71d4fa7-e04a-491d-9c0f-a91f444e9d83/volumes" Apr 20 08:00:03.294413 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:00:03.294387 2575 status_manager.go:895] "Failed to get status for pod" podUID="a71d4fa7-e04a-491d-9c0f-a91f444e9d83" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-9twjp" err="pods 
\"limitador-operator-controller-manager-85c4996f8c-9twjp\" is forbidden: User \"system:node:ip-10-0-129-24.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-129-24.ec2.internal' and this object" Apr 20 08:00:14.124574 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:00:14.124539 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-sz9jn" Apr 20 08:00:41.172697 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:00:41.172661 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m8r5r"] Apr 20 08:00:41.176709 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:00:41.176685 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-78c99df468-m8r5r" Apr 20 08:00:41.179188 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:00:41.179163 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\"" Apr 20 08:00:41.179332 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:00:41.179313 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-7v55j\"" Apr 20 08:00:41.182661 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:00:41.182637 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m8r5r"] Apr 20 08:00:41.197776 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:00:41.197746 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m8r5r"] Apr 20 08:00:41.308875 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:00:41.308838 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: 
\"kubernetes.io/configmap/6518e063-ceed-4c2b-9365-cdc66c4ecd37-config-file\") pod \"limitador-limitador-78c99df468-m8r5r\" (UID: \"6518e063-ceed-4c2b-9365-cdc66c4ecd37\") " pod="kuadrant-system/limitador-limitador-78c99df468-m8r5r" Apr 20 08:00:41.309056 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:00:41.308892 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4spxr\" (UniqueName: \"kubernetes.io/projected/6518e063-ceed-4c2b-9365-cdc66c4ecd37-kube-api-access-4spxr\") pod \"limitador-limitador-78c99df468-m8r5r\" (UID: \"6518e063-ceed-4c2b-9365-cdc66c4ecd37\") " pod="kuadrant-system/limitador-limitador-78c99df468-m8r5r" Apr 20 08:00:41.410164 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:00:41.410124 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4spxr\" (UniqueName: \"kubernetes.io/projected/6518e063-ceed-4c2b-9365-cdc66c4ecd37-kube-api-access-4spxr\") pod \"limitador-limitador-78c99df468-m8r5r\" (UID: \"6518e063-ceed-4c2b-9365-cdc66c4ecd37\") " pod="kuadrant-system/limitador-limitador-78c99df468-m8r5r" Apr 20 08:00:41.410329 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:00:41.410212 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/6518e063-ceed-4c2b-9365-cdc66c4ecd37-config-file\") pod \"limitador-limitador-78c99df468-m8r5r\" (UID: \"6518e063-ceed-4c2b-9365-cdc66c4ecd37\") " pod="kuadrant-system/limitador-limitador-78c99df468-m8r5r" Apr 20 08:00:41.410804 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:00:41.410785 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/6518e063-ceed-4c2b-9365-cdc66c4ecd37-config-file\") pod \"limitador-limitador-78c99df468-m8r5r\" (UID: \"6518e063-ceed-4c2b-9365-cdc66c4ecd37\") " pod="kuadrant-system/limitador-limitador-78c99df468-m8r5r" Apr 20 08:00:41.418191 
ip-10-0-129-24 kubenswrapper[2575]: I0420 08:00:41.418161 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4spxr\" (UniqueName: \"kubernetes.io/projected/6518e063-ceed-4c2b-9365-cdc66c4ecd37-kube-api-access-4spxr\") pod \"limitador-limitador-78c99df468-m8r5r\" (UID: \"6518e063-ceed-4c2b-9365-cdc66c4ecd37\") " pod="kuadrant-system/limitador-limitador-78c99df468-m8r5r" Apr 20 08:00:41.488010 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:00:41.487923 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-78c99df468-m8r5r" Apr 20 08:00:41.608944 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:00:41.608923 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m8r5r"] Apr 20 08:00:41.611087 ip-10-0-129-24 kubenswrapper[2575]: W0420 08:00:41.611046 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6518e063_ceed_4c2b_9365_cdc66c4ecd37.slice/crio-33d9658f048df17c901c81f1bbae507cf855096edfd087b90459623af9754a6a WatchSource:0}: Error finding container 33d9658f048df17c901c81f1bbae507cf855096edfd087b90459623af9754a6a: Status 404 returned error can't find the container with id 33d9658f048df17c901c81f1bbae507cf855096edfd087b90459623af9754a6a Apr 20 08:00:42.254234 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:00:42.254181 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-78c99df468-m8r5r" event={"ID":"6518e063-ceed-4c2b-9365-cdc66c4ecd37","Type":"ContainerStarted","Data":"33d9658f048df17c901c81f1bbae507cf855096edfd087b90459623af9754a6a"} Apr 20 08:00:45.273718 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:00:45.273678 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-78c99df468-m8r5r" 
event={"ID":"6518e063-ceed-4c2b-9365-cdc66c4ecd37","Type":"ContainerStarted","Data":"71ca0d62013612749e2ce3c7ca82f45cc5ff525a74796185e3421479b2a3c3c9"} Apr 20 08:00:45.274135 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:00:45.273737 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-78c99df468-m8r5r" Apr 20 08:00:45.288934 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:00:45.288879 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-78c99df468-m8r5r" podStartSLOduration=1.6898435969999999 podStartE2EDuration="4.288866751s" podCreationTimestamp="2026-04-20 08:00:41 +0000 UTC" firstStartedPulling="2026-04-20 08:00:41.612859431 +0000 UTC m=+638.786186838" lastFinishedPulling="2026-04-20 08:00:44.211882584 +0000 UTC m=+641.385209992" observedRunningTime="2026-04-20 08:00:45.288237338 +0000 UTC m=+642.461564782" watchObservedRunningTime="2026-04-20 08:00:45.288866751 +0000 UTC m=+642.462194180" Apr 20 08:00:56.278152 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:00:56.278119 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-78c99df468-m8r5r" Apr 20 08:01:23.453167 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:01:23.453128 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m8r5r"] Apr 20 08:02:05.903042 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:02:05.902959 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m8r5r"] Apr 20 08:02:16.602143 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:02:16.602103 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m8r5r"] Apr 20 08:02:19.097785 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:02:19.097750 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kuadrant-system/limitador-limitador-78c99df468-m8r5r"] Apr 20 08:02:37.396979 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:02:37.396946 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m8r5r"] Apr 20 08:02:46.412936 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:02:46.412893 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m8r5r"] Apr 20 08:02:53.701038 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:02:53.701006 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m8r5r"] Apr 20 08:04:01.404365 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:04:01.404332 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m8r5r"] Apr 20 08:04:12.607403 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:04:12.607366 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m8r5r"] Apr 20 08:04:20.801255 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:04:20.801216 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m8r5r"] Apr 20 08:04:31.504208 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:04:31.504167 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m8r5r"] Apr 20 08:04:39.905023 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:04:39.904983 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m8r5r"] Apr 20 08:04:50.699020 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:04:50.698938 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m8r5r"] Apr 20 08:05:03.292570 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:05:03.292541 2575 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-crn8d_f3a24a42-45b7-4726-bc7c-32d9f9d61eaf/ovn-acl-logging/0.log" Apr 20 08:05:03.294653 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:05:03.294631 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-crn8d_f3a24a42-45b7-4726-bc7c-32d9f9d61eaf/ovn-acl-logging/0.log" Apr 20 08:05:52.598167 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:05:52.598126 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m8r5r"] Apr 20 08:06:08.790648 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:06:08.790595 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m8r5r"] Apr 20 08:06:46.601012 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:06:46.600978 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m8r5r"] Apr 20 08:07:03.196363 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:07:03.196327 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m8r5r"] Apr 20 08:07:17.903530 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:07:17.903493 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m8r5r"] Apr 20 08:07:34.400369 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:07:34.400331 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m8r5r"] Apr 20 08:07:38.499161 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:07:38.499127 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m8r5r"] Apr 20 08:08:28.197918 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:08:28.197882 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m8r5r"] Apr 20 08:08:37.506282 ip-10-0-129-24 kubenswrapper[2575]: 
I0420 08:08:37.506241 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m8r5r"] Apr 20 08:08:53.806154 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:08:53.806118 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m8r5r"] Apr 20 08:09:03.395977 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:09:03.395941 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m8r5r"] Apr 20 08:09:19.902486 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:09:19.902451 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m8r5r"] Apr 20 08:09:28.003349 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:09:28.003313 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m8r5r"] Apr 20 08:10:01.198299 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:10:01.198262 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m8r5r"] Apr 20 08:10:03.314401 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:10:03.314365 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-crn8d_f3a24a42-45b7-4726-bc7c-32d9f9d61eaf/ovn-acl-logging/0.log" Apr 20 08:10:03.317510 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:10:03.317490 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-crn8d_f3a24a42-45b7-4726-bc7c-32d9f9d61eaf/ovn-acl-logging/0.log" Apr 20 08:10:09.295441 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:10:09.295410 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m8r5r"] Apr 20 08:10:13.193899 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:10:13.193864 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kuadrant-system/limitador-limitador-78c99df468-m8r5r"] Apr 20 08:10:17.497017 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:10:17.496980 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m8r5r"] Apr 20 08:10:26.700351 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:10:26.700312 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m8r5r"] Apr 20 08:10:35.301365 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:10:35.301327 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m8r5r"] Apr 20 08:10:51.497405 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:10:51.497328 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m8r5r"] Apr 20 08:11:03.000945 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:11:03.000908 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m8r5r"] Apr 20 08:11:46.095165 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:11:46.095129 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m8r5r"] Apr 20 08:11:49.997358 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:11:49.997321 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m8r5r"] Apr 20 08:12:00.506448 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:12:00.506409 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m8r5r"] Apr 20 08:12:09.199721 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:12:09.199686 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m8r5r"] Apr 20 08:12:18.598498 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:12:18.598457 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kuadrant-system/limitador-limitador-78c99df468-m8r5r"] Apr 20 08:12:27.806717 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:12:27.806635 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m8r5r"] Apr 20 08:12:38.802839 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:12:38.802796 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m8r5r"] Apr 20 08:12:46.807248 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:12:46.807206 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m8r5r"] Apr 20 08:12:56.802129 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:12:56.802096 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m8r5r"] Apr 20 08:13:05.005385 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:13:05.005350 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m8r5r"] Apr 20 08:13:14.901469 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:13:14.901433 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m8r5r"] Apr 20 08:13:23.399562 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:13:23.399525 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m8r5r"] Apr 20 08:13:33.596194 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:13:33.596159 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m8r5r"] Apr 20 08:13:43.194500 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:13:43.194462 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m8r5r"] Apr 20 08:13:53.201547 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:13:53.201465 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kuadrant-system/limitador-limitador-78c99df468-m8r5r"] Apr 20 08:14:01.294715 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:14:01.294681 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m8r5r"] Apr 20 08:14:11.500250 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:14:11.500214 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m8r5r"] Apr 20 08:14:20.201230 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:14:20.201191 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m8r5r"] Apr 20 08:14:30.000730 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:14:30.000696 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m8r5r"] Apr 20 08:14:38.900567 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:14:38.900532 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m8r5r"] Apr 20 08:14:49.306220 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:14:49.306186 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m8r5r"] Apr 20 08:14:54.096512 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:14:54.096469 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m8r5r"] Apr 20 08:15:03.338646 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:15:03.338604 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-crn8d_f3a24a42-45b7-4726-bc7c-32d9f9d61eaf/ovn-acl-logging/0.log" Apr 20 08:15:03.340236 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:15:03.340211 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-crn8d_f3a24a42-45b7-4726-bc7c-32d9f9d61eaf/ovn-acl-logging/0.log" Apr 20 08:15:29.196392 ip-10-0-129-24 kubenswrapper[2575]: I0420 
08:15:29.196308 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m8r5r"] Apr 20 08:15:33.300753 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:15:33.300717 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m8r5r"] Apr 20 08:17:04.408184 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:17:04.408098 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m8r5r"] Apr 20 08:17:09.801457 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:17:09.801419 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m8r5r"] Apr 20 08:17:37.900994 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:17:37.900959 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m8r5r"] Apr 20 08:17:44.701883 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:17:44.701846 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m8r5r"] Apr 20 08:17:54.903971 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:17:54.903934 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m8r5r"] Apr 20 08:18:05.596554 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:18:05.596517 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m8r5r"] Apr 20 08:18:14.101133 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:18:14.101097 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m8r5r"] Apr 20 08:18:26.201631 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:18:26.201532 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m8r5r"] Apr 20 08:18:34.798385 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:18:34.798346 2575 kubelet.go:2544] 
"SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m8r5r"] Apr 20 08:18:45.401581 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:18:45.401545 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m8r5r"] Apr 20 08:18:54.198888 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:18:54.198845 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m8r5r"] Apr 20 08:19:05.100717 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:19:05.100682 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m8r5r"] Apr 20 08:19:14.205781 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:19:14.205744 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m8r5r"] Apr 20 08:19:19.699652 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:19:19.699604 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m8r5r"] Apr 20 08:19:47.399678 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:19:47.399633 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m8r5r"] Apr 20 08:20:03.360457 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:20:03.360429 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-crn8d_f3a24a42-45b7-4726-bc7c-32d9f9d61eaf/ovn-acl-logging/0.log" Apr 20 08:20:03.367935 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:20:03.367910 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-crn8d_f3a24a42-45b7-4726-bc7c-32d9f9d61eaf/ovn-acl-logging/0.log" Apr 20 08:20:34.500315 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:20:34.500277 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m8r5r"] Apr 20 08:20:43.298180 
ip-10-0-129-24 kubenswrapper[2575]: I0420 08:20:43.298141 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m8r5r"] Apr 20 08:20:52.305800 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:20:52.305766 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m8r5r"] Apr 20 08:21:00.597913 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:21:00.597882 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m8r5r"] Apr 20 08:21:11.502280 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:21:11.502242 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m8r5r"] Apr 20 08:21:21.704804 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:21:21.704726 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m8r5r"] Apr 20 08:21:31.504664 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:21:31.504631 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m8r5r"] Apr 20 08:21:42.803734 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:21:42.803695 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m8r5r"] Apr 20 08:21:52.507655 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:21:52.507608 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m8r5r"] Apr 20 08:22:00.711652 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:22:00.711602 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m8r5r"] Apr 20 08:22:21.403602 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:22:21.403567 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m8r5r"] Apr 20 08:22:35.496132 ip-10-0-129-24 kubenswrapper[2575]: 
I0420 08:22:35.496090 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m8r5r"] Apr 20 08:22:54.908037 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:22:54.907954 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m8r5r"] Apr 20 08:23:04.305599 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:23:04.305566 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m8r5r"] Apr 20 08:23:14.107899 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:23:14.107861 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m8r5r"] Apr 20 08:23:22.705133 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:23:22.705091 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m8r5r"] Apr 20 08:23:40.415257 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:23:40.413200 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m8r5r"] Apr 20 08:23:48.402877 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:23:48.402843 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m8r5r"] Apr 20 08:23:58.595727 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:23:58.595690 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m8r5r"] Apr 20 08:24:06.109189 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:24:06.109151 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m8r5r"] Apr 20 08:24:15.402045 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:24:15.402012 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m8r5r"] Apr 20 08:24:23.805018 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:24:23.804928 2575 
kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m8r5r"] Apr 20 08:24:33.999933 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:24:33.999895 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m8r5r"] Apr 20 08:24:43.906419 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:24:43.906384 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m8r5r"] Apr 20 08:24:52.800104 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:24:52.800065 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m8r5r"] Apr 20 08:25:03.388366 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:25:03.388340 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-crn8d_f3a24a42-45b7-4726-bc7c-32d9f9d61eaf/ovn-acl-logging/0.log" Apr 20 08:25:03.390796 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:25:03.390772 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-crn8d_f3a24a42-45b7-4726-bc7c-32d9f9d61eaf/ovn-acl-logging/0.log" Apr 20 08:25:05.899925 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:25:05.899890 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m8r5r"] Apr 20 08:25:14.809267 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:25:14.809230 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m8r5r"] Apr 20 08:25:23.507877 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:25:23.507835 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m8r5r"] Apr 20 08:25:31.505933 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:25:31.505897 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m8r5r"] Apr 20 
08:25:39.703367 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:25:39.703332 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m8r5r"] Apr 20 08:25:56.204154 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:25:56.204075 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m8r5r"] Apr 20 08:26:04.501876 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:26:04.501838 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m8r5r"] Apr 20 08:26:13.609885 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:26:13.609850 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m8r5r"] Apr 20 08:26:22.103246 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:26:22.103206 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m8r5r"] Apr 20 08:26:46.308705 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:26:46.308672 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m8r5r"] Apr 20 08:26:58.405932 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:26:58.405895 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-m8r5r"] Apr 20 08:27:04.803182 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:27:04.803149 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-687c889b9-cxk5c_98cc77c3-3ed4-49be-914d-ae2b1d05c5dd/manager/0.log" Apr 20 08:27:07.291847 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:27:07.291820 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-78c99df468-m8r5r_6518e063-ceed-4c2b-9365-cdc66c4ecd37/limitador/0.log" Apr 20 08:27:07.420155 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:27:07.420121 2575 log.go:25] "Finished parsing 
log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-85c4996f8c-sz9jn_c753973c-52b8-4bc4-8525-997c6a67bde6/manager/0.log" Apr 20 08:27:08.178502 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:27:08.178469 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-7b9784c649-q6g22_eb121239-9981-44fc-80ad-5c6f5b00b76c/kube-auth-proxy/0.log" Apr 20 08:27:13.945522 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:27:13.945488 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-mvtfk/must-gather-7789r"] Apr 20 08:27:13.948977 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:27:13.948960 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-mvtfk/must-gather-7789r" Apr 20 08:27:13.951589 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:27:13.951566 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-mvtfk\"/\"default-dockercfg-qxwb7\"" Apr 20 08:27:13.952865 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:27:13.952845 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-mvtfk\"/\"kube-root-ca.crt\"" Apr 20 08:27:13.952981 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:27:13.952905 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-mvtfk\"/\"openshift-service-ca.crt\"" Apr 20 08:27:13.966075 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:27:13.966049 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-mvtfk/must-gather-7789r"] Apr 20 08:27:14.069212 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:27:14.069174 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/5ed4312f-daf5-4611-b833-dea854a14715-must-gather-output\") pod \"must-gather-7789r\" (UID: 
\"5ed4312f-daf5-4611-b833-dea854a14715\") " pod="openshift-must-gather-mvtfk/must-gather-7789r" Apr 20 08:27:14.069395 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:27:14.069230 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czrhp\" (UniqueName: \"kubernetes.io/projected/5ed4312f-daf5-4611-b833-dea854a14715-kube-api-access-czrhp\") pod \"must-gather-7789r\" (UID: \"5ed4312f-daf5-4611-b833-dea854a14715\") " pod="openshift-must-gather-mvtfk/must-gather-7789r" Apr 20 08:27:14.170503 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:27:14.170464 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/5ed4312f-daf5-4611-b833-dea854a14715-must-gather-output\") pod \"must-gather-7789r\" (UID: \"5ed4312f-daf5-4611-b833-dea854a14715\") " pod="openshift-must-gather-mvtfk/must-gather-7789r" Apr 20 08:27:14.170687 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:27:14.170527 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-czrhp\" (UniqueName: \"kubernetes.io/projected/5ed4312f-daf5-4611-b833-dea854a14715-kube-api-access-czrhp\") pod \"must-gather-7789r\" (UID: \"5ed4312f-daf5-4611-b833-dea854a14715\") " pod="openshift-must-gather-mvtfk/must-gather-7789r" Apr 20 08:27:14.170907 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:27:14.170882 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/5ed4312f-daf5-4611-b833-dea854a14715-must-gather-output\") pod \"must-gather-7789r\" (UID: \"5ed4312f-daf5-4611-b833-dea854a14715\") " pod="openshift-must-gather-mvtfk/must-gather-7789r" Apr 20 08:27:14.178916 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:27:14.178894 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-czrhp\" (UniqueName: 
\"kubernetes.io/projected/5ed4312f-daf5-4611-b833-dea854a14715-kube-api-access-czrhp\") pod \"must-gather-7789r\" (UID: \"5ed4312f-daf5-4611-b833-dea854a14715\") " pod="openshift-must-gather-mvtfk/must-gather-7789r" Apr 20 08:27:14.257802 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:27:14.257703 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-mvtfk/must-gather-7789r" Apr 20 08:27:14.381772 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:27:14.381745 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-mvtfk/must-gather-7789r"] Apr 20 08:27:14.384447 ip-10-0-129-24 kubenswrapper[2575]: W0420 08:27:14.384420 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5ed4312f_daf5_4611_b833_dea854a14715.slice/crio-e2f2ab326eaa427555cad3d39a247649e832248135a0c328bf9df5d8418fc7e1 WatchSource:0}: Error finding container e2f2ab326eaa427555cad3d39a247649e832248135a0c328bf9df5d8418fc7e1: Status 404 returned error can't find the container with id e2f2ab326eaa427555cad3d39a247649e832248135a0c328bf9df5d8418fc7e1 Apr 20 08:27:14.386809 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:27:14.386790 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 08:27:14.454847 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:27:14.454815 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mvtfk/must-gather-7789r" event={"ID":"5ed4312f-daf5-4611-b833-dea854a14715","Type":"ContainerStarted","Data":"e2f2ab326eaa427555cad3d39a247649e832248135a0c328bf9df5d8418fc7e1"} Apr 20 08:27:16.463866 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:27:16.463824 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mvtfk/must-gather-7789r" 
event={"ID":"5ed4312f-daf5-4611-b833-dea854a14715","Type":"ContainerStarted","Data":"1ffc3840faf4a4b65270b720edb8fddcc82adc4f3a5aa0689021b286c3ab161c"} Apr 20 08:27:16.464327 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:27:16.463894 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mvtfk/must-gather-7789r" event={"ID":"5ed4312f-daf5-4611-b833-dea854a14715","Type":"ContainerStarted","Data":"f052ebff1e23f5bde5a89b481d9e22395db0ced66048160fe97092c7dbdccee1"} Apr 20 08:27:16.479184 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:27:16.479132 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-mvtfk/must-gather-7789r" podStartSLOduration=2.513073657 podStartE2EDuration="3.47911797s" podCreationTimestamp="2026-04-20 08:27:13 +0000 UTC" firstStartedPulling="2026-04-20 08:27:14.386927564 +0000 UTC m=+2231.560254971" lastFinishedPulling="2026-04-20 08:27:15.352971874 +0000 UTC m=+2232.526299284" observedRunningTime="2026-04-20 08:27:16.477979325 +0000 UTC m=+2233.651306755" watchObservedRunningTime="2026-04-20 08:27:16.47911797 +0000 UTC m=+2233.652445428" Apr 20 08:27:16.877744 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:27:16.877713 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-nv6pl_036e621d-2979-4552-ac5c-6fab5743df3a/global-pull-secret-syncer/0.log" Apr 20 08:27:16.954439 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:27:16.954398 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-ggtd4_1755049c-9fb4-42bf-8134-d41e0e7a4e97/konnectivity-agent/0.log" Apr 20 08:27:17.026127 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:27:17.026096 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-129-24.ec2.internal_5018c020cf992df6097e0aee42f16bf3/haproxy/0.log" Apr 20 08:27:21.024438 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:27:21.024413 2575 log.go:25] 
"Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-78c99df468-m8r5r_6518e063-ceed-4c2b-9365-cdc66c4ecd37/limitador/0.log" Apr 20 08:27:21.106719 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:27:21.106681 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-85c4996f8c-sz9jn_c753973c-52b8-4bc4-8525-997c6a67bde6/manager/0.log" Apr 20 08:27:22.695991 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:27:22.695899 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-mt56d_1c403e29-027b-4589-8597-f3313cb7a43d/node-exporter/0.log" Apr 20 08:27:22.720197 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:27:22.720164 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-mt56d_1c403e29-027b-4589-8597-f3313cb7a43d/kube-rbac-proxy/0.log" Apr 20 08:27:22.744088 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:27:22.744061 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-mt56d_1c403e29-027b-4589-8597-f3313cb7a43d/init-textfile/0.log" Apr 20 08:27:23.038014 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:27:23.037982 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_5e53bbd8-7519-4447-aff6-96c8adf1d541/prometheus/0.log" Apr 20 08:27:23.062787 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:27:23.062753 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_5e53bbd8-7519-4447-aff6-96c8adf1d541/config-reloader/0.log" Apr 20 08:27:23.093004 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:27:23.092974 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_5e53bbd8-7519-4447-aff6-96c8adf1d541/thanos-sidecar/0.log" Apr 20 08:27:23.118182 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:27:23.118140 2575 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_5e53bbd8-7519-4447-aff6-96c8adf1d541/kube-rbac-proxy-web/0.log" Apr 20 08:27:23.141738 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:27:23.141688 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_5e53bbd8-7519-4447-aff6-96c8adf1d541/kube-rbac-proxy/0.log" Apr 20 08:27:23.166309 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:27:23.166282 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_5e53bbd8-7519-4447-aff6-96c8adf1d541/kube-rbac-proxy-thanos/0.log" Apr 20 08:27:23.182828 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:27:23.182798 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_5e53bbd8-7519-4447-aff6-96c8adf1d541/init-config-reloader/0.log" Apr 20 08:27:23.251343 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:27:23.251265 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-6xmbv_27805bcb-303c-42a4-8c37-030c42c57561/prometheus-operator-admission-webhook/0.log" Apr 20 08:27:23.353240 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:27:23.353213 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-67dfbc55cc-xxt72_b749342e-e13a-4969-8047-1f0d69bbbcef/thanos-query/0.log" Apr 20 08:27:23.377003 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:27:23.376979 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-67dfbc55cc-xxt72_b749342e-e13a-4969-8047-1f0d69bbbcef/kube-rbac-proxy-web/0.log" Apr 20 08:27:23.398860 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:27:23.398809 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-67dfbc55cc-xxt72_b749342e-e13a-4969-8047-1f0d69bbbcef/kube-rbac-proxy/0.log" Apr 20 08:27:23.417507 ip-10-0-129-24 
kubenswrapper[2575]: I0420 08:27:23.417473 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-67dfbc55cc-xxt72_b749342e-e13a-4969-8047-1f0d69bbbcef/prom-label-proxy/0.log" Apr 20 08:27:23.442466 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:27:23.442434 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-67dfbc55cc-xxt72_b749342e-e13a-4969-8047-1f0d69bbbcef/kube-rbac-proxy-rules/0.log" Apr 20 08:27:23.465166 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:27:23.465133 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-67dfbc55cc-xxt72_b749342e-e13a-4969-8047-1f0d69bbbcef/kube-rbac-proxy-metrics/0.log" Apr 20 08:27:24.684627 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:27:24.684579 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-smjd7_2e4b6f0f-88ff-49a7-ada1-7af9515863da/networking-console-plugin/0.log" Apr 20 08:27:25.975306 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:27:25.975267 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-mvtfk/perf-node-gather-daemonset-9vnqc"] Apr 20 08:27:25.982146 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:27:25.982123 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-mvtfk/perf-node-gather-daemonset-9vnqc"
Apr 20 08:27:25.996443 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:27:25.996414 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-mvtfk/perf-node-gather-daemonset-9vnqc"]
Apr 20 08:27:26.088963 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:27:26.088930 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/0ce58d16-1698-4d46-b33c-f58367541f9b-proc\") pod \"perf-node-gather-daemonset-9vnqc\" (UID: \"0ce58d16-1698-4d46-b33c-f58367541f9b\") " pod="openshift-must-gather-mvtfk/perf-node-gather-daemonset-9vnqc"
Apr 20 08:27:26.089138 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:27:26.089002 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0ce58d16-1698-4d46-b33c-f58367541f9b-sys\") pod \"perf-node-gather-daemonset-9vnqc\" (UID: \"0ce58d16-1698-4d46-b33c-f58367541f9b\") " pod="openshift-must-gather-mvtfk/perf-node-gather-daemonset-9vnqc"
Apr 20 08:27:26.089138 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:27:26.089065 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0ce58d16-1698-4d46-b33c-f58367541f9b-lib-modules\") pod \"perf-node-gather-daemonset-9vnqc\" (UID: \"0ce58d16-1698-4d46-b33c-f58367541f9b\") " pod="openshift-must-gather-mvtfk/perf-node-gather-daemonset-9vnqc"
Apr 20 08:27:26.089138 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:27:26.089092 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/0ce58d16-1698-4d46-b33c-f58367541f9b-podres\") pod \"perf-node-gather-daemonset-9vnqc\" (UID: \"0ce58d16-1698-4d46-b33c-f58367541f9b\") " pod="openshift-must-gather-mvtfk/perf-node-gather-daemonset-9vnqc"
Apr 20 08:27:26.089265 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:27:26.089156 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrghk\" (UniqueName: \"kubernetes.io/projected/0ce58d16-1698-4d46-b33c-f58367541f9b-kube-api-access-jrghk\") pod \"perf-node-gather-daemonset-9vnqc\" (UID: \"0ce58d16-1698-4d46-b33c-f58367541f9b\") " pod="openshift-must-gather-mvtfk/perf-node-gather-daemonset-9vnqc"
Apr 20 08:27:26.190035 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:27:26.189996 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0ce58d16-1698-4d46-b33c-f58367541f9b-lib-modules\") pod \"perf-node-gather-daemonset-9vnqc\" (UID: \"0ce58d16-1698-4d46-b33c-f58367541f9b\") " pod="openshift-must-gather-mvtfk/perf-node-gather-daemonset-9vnqc"
Apr 20 08:27:26.190035 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:27:26.190036 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/0ce58d16-1698-4d46-b33c-f58367541f9b-podres\") pod \"perf-node-gather-daemonset-9vnqc\" (UID: \"0ce58d16-1698-4d46-b33c-f58367541f9b\") " pod="openshift-must-gather-mvtfk/perf-node-gather-daemonset-9vnqc"
Apr 20 08:27:26.190276 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:27:26.190060 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jrghk\" (UniqueName: \"kubernetes.io/projected/0ce58d16-1698-4d46-b33c-f58367541f9b-kube-api-access-jrghk\") pod \"perf-node-gather-daemonset-9vnqc\" (UID: \"0ce58d16-1698-4d46-b33c-f58367541f9b\") " pod="openshift-must-gather-mvtfk/perf-node-gather-daemonset-9vnqc"
Apr 20 08:27:26.190276 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:27:26.190086 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/0ce58d16-1698-4d46-b33c-f58367541f9b-proc\") pod \"perf-node-gather-daemonset-9vnqc\" (UID: \"0ce58d16-1698-4d46-b33c-f58367541f9b\") " pod="openshift-must-gather-mvtfk/perf-node-gather-daemonset-9vnqc"
Apr 20 08:27:26.190276 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:27:26.190131 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0ce58d16-1698-4d46-b33c-f58367541f9b-sys\") pod \"perf-node-gather-daemonset-9vnqc\" (UID: \"0ce58d16-1698-4d46-b33c-f58367541f9b\") " pod="openshift-must-gather-mvtfk/perf-node-gather-daemonset-9vnqc"
Apr 20 08:27:26.190276 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:27:26.190178 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/0ce58d16-1698-4d46-b33c-f58367541f9b-podres\") pod \"perf-node-gather-daemonset-9vnqc\" (UID: \"0ce58d16-1698-4d46-b33c-f58367541f9b\") " pod="openshift-must-gather-mvtfk/perf-node-gather-daemonset-9vnqc"
Apr 20 08:27:26.190276 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:27:26.190177 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0ce58d16-1698-4d46-b33c-f58367541f9b-lib-modules\") pod \"perf-node-gather-daemonset-9vnqc\" (UID: \"0ce58d16-1698-4d46-b33c-f58367541f9b\") " pod="openshift-must-gather-mvtfk/perf-node-gather-daemonset-9vnqc"
Apr 20 08:27:26.190276 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:27:26.190225 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/0ce58d16-1698-4d46-b33c-f58367541f9b-proc\") pod \"perf-node-gather-daemonset-9vnqc\" (UID: \"0ce58d16-1698-4d46-b33c-f58367541f9b\") " pod="openshift-must-gather-mvtfk/perf-node-gather-daemonset-9vnqc"
Apr 20 08:27:26.190276 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:27:26.190244 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0ce58d16-1698-4d46-b33c-f58367541f9b-sys\") pod \"perf-node-gather-daemonset-9vnqc\" (UID: \"0ce58d16-1698-4d46-b33c-f58367541f9b\") " pod="openshift-must-gather-mvtfk/perf-node-gather-daemonset-9vnqc"
Apr 20 08:27:26.197650 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:27:26.197606 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrghk\" (UniqueName: \"kubernetes.io/projected/0ce58d16-1698-4d46-b33c-f58367541f9b-kube-api-access-jrghk\") pod \"perf-node-gather-daemonset-9vnqc\" (UID: \"0ce58d16-1698-4d46-b33c-f58367541f9b\") " pod="openshift-must-gather-mvtfk/perf-node-gather-daemonset-9vnqc"
Apr 20 08:27:26.294112 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:27:26.294035 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-mvtfk/perf-node-gather-daemonset-9vnqc"
Apr 20 08:27:26.446015 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:27:26.445941 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-mvtfk/perf-node-gather-daemonset-9vnqc"]
Apr 20 08:27:26.449381 ip-10-0-129-24 kubenswrapper[2575]: W0420 08:27:26.449343 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod0ce58d16_1698_4d46_b33c_f58367541f9b.slice/crio-3af343160fd5b333d6ae4b3a065a0c63cc34b060ad7573cedec74222d201211a WatchSource:0}: Error finding container 3af343160fd5b333d6ae4b3a065a0c63cc34b060ad7573cedec74222d201211a: Status 404 returned error can't find the container with id 3af343160fd5b333d6ae4b3a065a0c63cc34b060ad7573cedec74222d201211a
Apr 20 08:27:26.510634 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:27:26.507223 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mvtfk/perf-node-gather-daemonset-9vnqc" event={"ID":"0ce58d16-1698-4d46-b33c-f58367541f9b","Type":"ContainerStarted","Data":"3af343160fd5b333d6ae4b3a065a0c63cc34b060ad7573cedec74222d201211a"}
Apr 20 08:27:27.104334 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:27:27.104300 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-jp28h_eccac7f6-7962-45d6-9141-b02deb90631f/dns/0.log"
Apr 20 08:27:27.124334 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:27:27.124303 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-jp28h_eccac7f6-7962-45d6-9141-b02deb90631f/kube-rbac-proxy/0.log"
Apr 20 08:27:27.185704 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:27:27.185672 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-z54dw_14afe6ef-f671-4736-ba3f-ac6236c30291/dns-node-resolver/0.log"
Apr 20 08:27:27.514984 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:27:27.514949 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mvtfk/perf-node-gather-daemonset-9vnqc" event={"ID":"0ce58d16-1698-4d46-b33c-f58367541f9b","Type":"ContainerStarted","Data":"04cec8512a4b0686decb7c94bcf222babb8d88a2d42645ec22e5784c8262bb93"}
Apr 20 08:27:27.515282 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:27:27.515129 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-mvtfk/perf-node-gather-daemonset-9vnqc"
Apr 20 08:27:27.539351 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:27:27.539289 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-mvtfk/perf-node-gather-daemonset-9vnqc" podStartSLOduration=2.539270056 podStartE2EDuration="2.539270056s" podCreationTimestamp="2026-04-20 08:27:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 08:27:27.537660786 +0000 UTC m=+2244.710988216" watchObservedRunningTime="2026-04-20 08:27:27.539270056 +0000 UTC m=+2244.712597486"
Apr 20 08:27:27.720748 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:27:27.720717 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-n9944_b7ed9bc9-8f0a-40e4-bd11-25b4f1c3cd39/node-ca/0.log"
Apr 20 08:27:28.586758 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:27:28.586726 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-7b9784c649-q6g22_eb121239-9981-44fc-80ad-5c6f5b00b76c/kube-auth-proxy/0.log"
Apr 20 08:27:29.162289 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:27:29.162255 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-mqbmm_5eaa06ed-8495-4d3f-a0e2-d67bb19b8695/serve-healthcheck-canary/0.log"
Apr 20 08:27:29.650835 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:27:29.650802 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-n5lkq_3a038fd0-e0d5-4ded-ab07-36ddf6d31d03/kube-rbac-proxy/0.log"
Apr 20 08:27:29.668028 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:27:29.668002 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-n5lkq_3a038fd0-e0d5-4ded-ab07-36ddf6d31d03/exporter/0.log"
Apr 20 08:27:29.689489 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:27:29.689460 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-n5lkq_3a038fd0-e0d5-4ded-ab07-36ddf6d31d03/extractor/0.log"
Apr 20 08:27:31.821756 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:27:31.821724 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-687c889b9-cxk5c_98cc77c3-3ed4-49be-914d-ae2b1d05c5dd/manager/0.log"
Apr 20 08:27:32.969836 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:27:32.969802 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-56d8f7c9b7-5wj6m_e69badf5-d10f-40d6-8fab-6e2b81746b60/manager/0.log"
Apr 20 08:27:32.988821 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:27:32.988790 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_openshift-lws-operator-bfc7f696d-ncdws_936f0d6f-8aff-449d-a239-9d9ef533103f/openshift-lws-operator/0.log"
Apr 20 08:27:33.534067 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:27:33.534034 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-mvtfk/perf-node-gather-daemonset-9vnqc"
Apr 20 08:27:37.365318 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:27:37.365291 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-gjmnh_b167b4a9-fb74-441f-bdfc-f8c71416c3ff/migrator/0.log"
Apr 20 08:27:37.382193 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:27:37.382110 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-gjmnh_b167b4a9-fb74-441f-bdfc-f8c71416c3ff/graceful-termination/0.log"
Apr 20 08:27:38.703779 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:27:38.703745 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-7r2r4_04eada51-b824-41d7-8f99-553412d17053/kube-multus/0.log"
Apr 20 08:27:38.871385 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:27:38.871359 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-hzzk5_ba7d0588-c3eb-4849-ae3b-630be7fcc621/kube-multus-additional-cni-plugins/0.log"
Apr 20 08:27:38.886690 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:27:38.886659 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-hzzk5_ba7d0588-c3eb-4849-ae3b-630be7fcc621/egress-router-binary-copy/0.log"
Apr 20 08:27:38.901846 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:27:38.901817 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-hzzk5_ba7d0588-c3eb-4849-ae3b-630be7fcc621/cni-plugins/0.log"
Apr 20 08:27:38.917655 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:27:38.917629 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-hzzk5_ba7d0588-c3eb-4849-ae3b-630be7fcc621/bond-cni-plugin/0.log"
Apr 20 08:27:38.932811 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:27:38.932780 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-hzzk5_ba7d0588-c3eb-4849-ae3b-630be7fcc621/routeoverride-cni/0.log"
Apr 20 08:27:38.950316 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:27:38.950286 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-hzzk5_ba7d0588-c3eb-4849-ae3b-630be7fcc621/whereabouts-cni-bincopy/0.log"
Apr 20 08:27:38.967079 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:27:38.967015 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-hzzk5_ba7d0588-c3eb-4849-ae3b-630be7fcc621/whereabouts-cni/0.log"
Apr 20 08:27:39.180458 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:27:39.180430 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-7qpdh_46b8deca-b33b-45e1-9131-b47fde192a78/network-metrics-daemon/0.log"
Apr 20 08:27:39.194625 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:27:39.194596 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-7qpdh_46b8deca-b33b-45e1-9131-b47fde192a78/kube-rbac-proxy/0.log"
Apr 20 08:27:40.330120 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:27:40.330081 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-crn8d_f3a24a42-45b7-4726-bc7c-32d9f9d61eaf/ovn-controller/0.log"
Apr 20 08:27:40.346968 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:27:40.346938 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-crn8d_f3a24a42-45b7-4726-bc7c-32d9f9d61eaf/ovn-acl-logging/0.log"
Apr 20 08:27:40.358192 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:27:40.358164 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-crn8d_f3a24a42-45b7-4726-bc7c-32d9f9d61eaf/ovn-acl-logging/1.log"
Apr 20 08:27:40.376317 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:27:40.376293 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-crn8d_f3a24a42-45b7-4726-bc7c-32d9f9d61eaf/kube-rbac-proxy-node/0.log"
Apr 20 08:27:40.394986 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:27:40.394962 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-crn8d_f3a24a42-45b7-4726-bc7c-32d9f9d61eaf/kube-rbac-proxy-ovn-metrics/0.log"
Apr 20 08:27:40.408288 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:27:40.408262 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-crn8d_f3a24a42-45b7-4726-bc7c-32d9f9d61eaf/northd/0.log"
Apr 20 08:27:40.425118 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:27:40.425100 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-crn8d_f3a24a42-45b7-4726-bc7c-32d9f9d61eaf/nbdb/0.log"
Apr 20 08:27:40.442540 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:27:40.442518 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-crn8d_f3a24a42-45b7-4726-bc7c-32d9f9d61eaf/sbdb/0.log"
Apr 20 08:27:40.552113 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:27:40.552079 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-crn8d_f3a24a42-45b7-4726-bc7c-32d9f9d61eaf/ovnkube-controller/0.log"
Apr 20 08:27:41.832958 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:27:41.832931 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-8894fc9bd-84g2g_66892cf2-e8c0-4ea8-b96d-237bfbb843f4/check-endpoints/0.log"
Apr 20 08:27:41.878087 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:27:41.878056 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-sfl6t_ef4d76cc-35ef-46b3-83b9-ee5fe3bb7a06/network-check-target-container/0.log"
Apr 20 08:27:42.892026 ip-10-0-129-24 kubenswrapper[2575]: I0420 08:27:42.891977 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-459v4_f8898642-a099-48c6-ba9f-0a5099d78d5c/iptables-alerter/0.log"