Apr 16 16:47:43.202845 ip-10-0-131-63 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 16:47:43.675869 ip-10-0-131-63 kubenswrapper[2568]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 16:47:43.675869 ip-10-0-131-63 kubenswrapper[2568]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 16:47:43.675869 ip-10-0-131-63 kubenswrapper[2568]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 16:47:43.675869 ip-10-0-131-63 kubenswrapper[2568]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 16:47:43.675869 ip-10-0-131-63 kubenswrapper[2568]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 16:47:43.678909 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.678818 2568 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 16:47:43.686547 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.686506 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 16:47:43.686547 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.686542 2568 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 16:47:43.686547 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.686547 2568 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 16:47:43.686547 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.686551 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 16:47:43.686547 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.686554 2568 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 16:47:43.686547 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.686558 2568 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 16:47:43.686769 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.686561 2568 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 16:47:43.686769 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.686564 2568 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 16:47:43.686769 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.686567 2568 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 16:47:43.686769 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.686570 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 16:47:43.686769 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.686573 2568 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 16:47:43.686769 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.686576 2568 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 16:47:43.686769 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.686578 2568 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 16:47:43.686769 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.686581 2568 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 16:47:43.686769 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.686584 2568 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 16:47:43.686769 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.686587 2568 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 16:47:43.686769 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.686591 2568 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 16:47:43.686769 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.686594 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 16:47:43.686769 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.686597 2568 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 16:47:43.686769 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.686600 2568 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 16:47:43.686769 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.686602 2568 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 16:47:43.686769 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.686605 2568 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 16:47:43.686769 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.686608 2568 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 16:47:43.686769 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.686611 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 16:47:43.686769 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.686613 2568 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 16:47:43.687216 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.686616 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 16:47:43.687216 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.686618 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 16:47:43.687216 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.686621 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 16:47:43.687216 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.686624 2568 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 16:47:43.687216 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.686627 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 16:47:43.687216 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.686629 2568 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 16:47:43.687216 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.686632 2568 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 16:47:43.687216 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.686635 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 16:47:43.687216 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.686637 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 16:47:43.687216 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.686640 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 16:47:43.687216 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.686642 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 16:47:43.687216 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.686645 2568 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 16:47:43.687216 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.686648 2568 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 16:47:43.687216 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.686650 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 16:47:43.687216 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.686655 2568 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 16:47:43.687216 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.686660 2568 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 16:47:43.687216 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.686663 2568 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 16:47:43.687216 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.686666 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 16:47:43.687216 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.686669 2568 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 16:47:43.687690 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.686671 2568 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 16:47:43.687690 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.686674 2568 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 16:47:43.687690 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.686677 2568 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 16:47:43.687690 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.686680 2568 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 16:47:43.687690 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.686682 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 16:47:43.687690 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.686685 2568 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 16:47:43.687690 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.686688 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 16:47:43.687690 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.686691 2568 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 16:47:43.687690 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.686694 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 16:47:43.687690 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.686696 2568 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 16:47:43.687690 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.686699 2568 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 16:47:43.687690 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.686702 2568 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 16:47:43.687690 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.686704 2568 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 16:47:43.687690 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.686708 2568 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 16:47:43.687690 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.686710 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 16:47:43.687690 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.686713 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 16:47:43.687690 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.686716 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 16:47:43.687690 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.686718 2568 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 16:47:43.687690 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.686721 2568 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 16:47:43.687690 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.686724 2568 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 16:47:43.688154 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.686726 2568 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 16:47:43.688154 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.686729 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 16:47:43.688154 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.686731 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 16:47:43.688154 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.686734 2568 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 16:47:43.688154 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.686737 2568 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 16:47:43.688154 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.686739 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 16:47:43.688154 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.686742 2568 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 16:47:43.688154 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.686746 2568 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 16:47:43.688154 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.686749 2568 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 16:47:43.688154 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.686752 2568 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 16:47:43.688154 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.686755 2568 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 16:47:43.688154 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.686758 2568 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 16:47:43.688154 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.686760 2568 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 16:47:43.688154 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.686763 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 16:47:43.688154 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.686768 2568 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 16:47:43.688154 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.686770 2568 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 16:47:43.688154 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.686773 2568 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 16:47:43.688154 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.686776 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 16:47:43.688154 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.686781 2568 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 16:47:43.688627 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.686784 2568 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 16:47:43.688627 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.686787 2568 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 16:47:43.688627 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.686789 2568 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 16:47:43.688627 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.687197 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 16:47:43.688627 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.687202 2568 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 16:47:43.688627 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.687205 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 16:47:43.688627 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.687208 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 16:47:43.688627 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.687211 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 16:47:43.688627 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.687214 2568 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 16:47:43.688627 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.687217 2568 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 16:47:43.688627 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.687220 2568 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 16:47:43.688627 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.687223 2568 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 16:47:43.688627 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.687226 2568 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 16:47:43.688627 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.687229 2568 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 16:47:43.688627 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.687232 2568 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 16:47:43.688627 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.687234 2568 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 16:47:43.688627 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.687237 2568 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 16:47:43.688627 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.687240 2568 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 16:47:43.688627 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.687242 2568 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 16:47:43.688627 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.687245 2568 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 16:47:43.689095 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.687248 2568 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 16:47:43.689095 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.687251 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 16:47:43.689095 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.687256 2568 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 16:47:43.689095 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.687259 2568 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 16:47:43.689095 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.687262 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 16:47:43.689095 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.687265 2568 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 16:47:43.689095 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.687268 2568 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 16:47:43.689095 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.687271 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 16:47:43.689095 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.687273 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 16:47:43.689095 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.687276 2568 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 16:47:43.689095 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.687279 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 16:47:43.689095 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.687281 2568 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 16:47:43.689095 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.687284 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 16:47:43.689095 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.687287 2568 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 16:47:43.689095 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.687289 2568 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 16:47:43.689095 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.687292 2568 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 16:47:43.689095 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.687294 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 16:47:43.689095 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.687297 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 16:47:43.689095 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.687299 2568 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 16:47:43.689095 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.687302 2568 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 16:47:43.689608 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.687305 2568 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 16:47:43.689608 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.687308 2568 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 16:47:43.689608 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.687310 2568 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 16:47:43.689608 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.687313 2568 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 16:47:43.689608 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.687316 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 16:47:43.689608 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.687319 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 16:47:43.689608 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.687321 2568 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 16:47:43.689608 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.687324 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 16:47:43.689608 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.687327 2568 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 16:47:43.689608 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.687329 2568 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 16:47:43.689608 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.687332 2568 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 16:47:43.689608 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.687334 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 16:47:43.689608 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.687337 2568 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 16:47:43.689608 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.687340 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 16:47:43.689608 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.687342 2568 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 16:47:43.689608 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.687345 2568 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 16:47:43.689608 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.687347 2568 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 16:47:43.689608 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.687350 2568 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 16:47:43.689608 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.687352 2568 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 16:47:43.690069 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.687355 2568 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 16:47:43.690069 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.687358 2568 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 16:47:43.690069 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.687361 2568 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 16:47:43.690069 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.687363 2568 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 16:47:43.690069 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.687366 2568 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 16:47:43.690069 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.687369 2568 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 16:47:43.690069 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.687371 2568 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 16:47:43.690069 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.687374 2568 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 16:47:43.690069 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.687376 2568 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 16:47:43.690069 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.687379 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 16:47:43.690069 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.687381 2568 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 16:47:43.690069 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.687384 2568 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 16:47:43.690069 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.687386 2568 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 16:47:43.690069 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.687388 2568 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 16:47:43.690069 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.687391 2568 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 16:47:43.690069 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.687393 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 16:47:43.690069 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.687397 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 16:47:43.690069 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.687399 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 16:47:43.690069 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.687402 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 16:47:43.690069 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.687405 2568 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 16:47:43.690572 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.687407 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 16:47:43.690572 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.687410 2568 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 16:47:43.690572 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.687412 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 16:47:43.690572 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.687415 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 16:47:43.690572 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.687418 2568 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 16:47:43.690572 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.687421 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 16:47:43.690572 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.687424 2568 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 16:47:43.690572 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.687427 2568 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 16:47:43.690572 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.687431 2568 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 16:47:43.690572 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.687434 2568 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 16:47:43.690572 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689289 2568 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 16:47:43.690572 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689300 2568 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 16:47:43.690572 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689311 2568 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 16:47:43.690572 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689316 2568 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 16:47:43.690572 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689321 2568 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 16:47:43.690572 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689325 2568 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 16:47:43.690572 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689330 2568 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 16:47:43.690572 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689334 2568 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 16:47:43.690572 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689338 2568 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 16:47:43.690572 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689341 2568 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 16:47:43.691082 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689345 2568 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 16:47:43.691082 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689348 2568 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 16:47:43.691082 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689352 2568 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 16:47:43.691082 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689355 2568 flags.go:64] FLAG: --cgroup-root=""
Apr 16 16:47:43.691082 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689359 2568 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 16:47:43.691082 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689362 2568 flags.go:64] FLAG: --client-ca-file=""
Apr 16 16:47:43.691082 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689365 2568 flags.go:64] FLAG: --cloud-config=""
Apr 16 16:47:43.691082 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689368 2568 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 16:47:43.691082 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689371 2568 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 16:47:43.691082 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689376 2568 flags.go:64] FLAG: --cluster-domain=""
Apr 16 16:47:43.691082 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689379 2568 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 16:47:43.691082 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689382 2568 flags.go:64] FLAG: --config-dir=""
Apr 16 16:47:43.691082 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689385 2568 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 16:47:43.691082 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689388 2568 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 16:47:43.691082 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689393 2568 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 16:47:43.691082 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689396 2568 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 16:47:43.691082 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689399 2568 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 16:47:43.691082 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689403 2568 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 16:47:43.691082 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689406 2568 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 16:47:43.691082 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689409 2568 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 16:47:43.691082 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689412 2568 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 16:47:43.691082 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689416 2568 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 16:47:43.691082 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689419 2568 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 16:47:43.691082 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689424 2568 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 16:47:43.691082 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689427 2568 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 16:47:43.691707 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689430 2568 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 16:47:43.691707 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689433 2568 flags.go:64] FLAG: --enable-load-reader="false"
Apr 16 16:47:43.691707 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689437 2568 flags.go:64] FLAG: --enable-server="true"
Apr 16 16:47:43.691707 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689441 2568 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 16 16:47:43.691707 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689446 2568 flags.go:64] FLAG: --event-burst="100"
Apr 16 16:47:43.691707 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689449 2568 flags.go:64] FLAG: --event-qps="50"
Apr 16 16:47:43.691707 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689452 2568 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 16 16:47:43.691707 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689456 2568 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 16 16:47:43.691707 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689459 2568 flags.go:64] FLAG: --eviction-hard=""
Apr 16 16:47:43.691707 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689463 2568 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 16 16:47:43.691707 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689466 2568 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 16 16:47:43.691707 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689476 2568 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 16 16:47:43.691707 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689481 2568 flags.go:64] FLAG: --eviction-soft=""
Apr 16 16:47:43.691707 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689484 2568 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 16 16:47:43.691707 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689487 2568 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 16 16:47:43.691707 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689490 2568 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 16 16:47:43.691707 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689493 2568 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 16 16:47:43.691707 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689496 2568 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 16 16:47:43.691707 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689499 2568 flags.go:64] FLAG: --fail-swap-on="true"
Apr 16 16:47:43.691707 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689503 2568 flags.go:64] FLAG: --feature-gates=""
Apr 16 16:47:43.691707 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689507 2568 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 16 16:47:43.691707 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689510 2568 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 16 16:47:43.691707 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689513 2568 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 16 16:47:43.691707 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689532 2568 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 16 16:47:43.691707 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689536 2568 flags.go:64] FLAG: --healthz-port="10248"
Apr 16 16:47:43.691707 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689539 2568 flags.go:64] FLAG: --help="false"
Apr 16 16:47:43.692364 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689542 2568 flags.go:64] FLAG: --hostname-override="ip-10-0-131-63.ec2.internal"
Apr 16 16:47:43.692364 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689545 2568 flags.go:64] FLAG: --housekeeping-interval="10s"
Apr 16 16:47:43.692364 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689548 2568 flags.go:64] FLAG: --http-check-frequency="20s"
Apr 16 16:47:43.692364 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689551 2568 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins"
Apr 16 16:47:43.692364 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689555 2568 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml"
Apr 16 16:47:43.692364 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689558 2568 flags.go:64] FLAG: --image-gc-high-threshold="85"
Apr 16 16:47:43.692364 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689561 2568 flags.go:64] FLAG: --image-gc-low-threshold="80"
Apr 16 16:47:43.692364 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689564 2568 flags.go:64] FLAG: --image-service-endpoint=""
Apr 16 16:47:43.692364 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689567 2568 flags.go:64] FLAG: --kernel-memcg-notification="false"
Apr 16 16:47:43.692364 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689570 2568 flags.go:64] FLAG: --kube-api-burst="100"
Apr 16 16:47:43.692364 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689573 2568 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Apr 16 16:47:43.692364 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689576 2568 flags.go:64] FLAG: --kube-api-qps="50"
Apr 16 16:47:43.692364 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689579 2568 flags.go:64] FLAG: --kube-reserved=""
Apr 16 16:47:43.692364 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689582 2568 flags.go:64] FLAG: --kube-reserved-cgroup=""
Apr 16 16:47:43.692364 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689585 2568 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Apr 16 16:47:43.692364 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689588 2568 flags.go:64] FLAG: --kubelet-cgroups=""
Apr 16 16:47:43.692364 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689591 2568 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Apr 16 16:47:43.692364 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689594 2568 flags.go:64] FLAG: --lock-file=""
Apr 16 16:47:43.692364 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689597 2568 flags.go:64] FLAG: --log-cadvisor-usage="false"
Apr 16 16:47:43.692364 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689600 2568 flags.go:64] FLAG: --log-flush-frequency="5s"
Apr 16 16:47:43.692364 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689603 2568 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Apr 16 16:47:43.692364 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689609 2568 flags.go:64] FLAG: --log-json-split-stream="false"
Apr 16 16:47:43.692364 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689612 2568 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Apr 16 16:47:43.692923 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689615 2568 flags.go:64] FLAG: --log-text-split-stream="false"
Apr 16 16:47:43.692923 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689618 2568 flags.go:64] FLAG: --logging-format="text"
Apr 16 16:47:43.692923 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689622 2568 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Apr 16 16:47:43.692923 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689626 2568 flags.go:64] FLAG: --make-iptables-util-chains="true"
Apr 16 16:47:43.692923 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689629 2568 flags.go:64] FLAG: --manifest-url=""
Apr 16 16:47:43.692923 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689633 2568 flags.go:64] FLAG: --manifest-url-header=""
Apr 16 16:47:43.692923 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689637 2568 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Apr 16 16:47:43.692923 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689641 2568 flags.go:64] FLAG: --max-open-files="1000000"
Apr 16 16:47:43.692923 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689645 2568 flags.go:64] FLAG: --max-pods="110"
Apr 16 16:47:43.692923 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689648 2568 flags.go:64] FLAG: --maximum-dead-containers="-1"
Apr 16 16:47:43.692923 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689652 2568 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Apr 16 16:47:43.692923 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689654 2568 flags.go:64] FLAG: --memory-manager-policy="None"
Apr 16 16:47:43.692923 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689657 2568 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Apr 16 16:47:43.692923 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689660 2568 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Apr 16 16:47:43.692923 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689664 2568 flags.go:64] FLAG: --node-ip="0.0.0.0"
Apr 16 16:47:43.692923 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689667 2568 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel"
Apr 16 16:47:43.692923 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689675 2568 flags.go:64] FLAG: --node-status-max-images="50"
Apr 16 16:47:43.692923 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689678 2568 flags.go:64] FLAG: --node-status-update-frequency="10s"
Apr 16 16:47:43.692923 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689681 2568 flags.go:64] FLAG: --oom-score-adj="-999"
Apr 16 16:47:43.692923 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689685 2568 flags.go:64] FLAG: --pod-cidr=""
Apr 16 16:47:43.692923 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689689 2568 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dc76bab72f320de3d4105c90d73c4fb139c09e20ce0fa8dcbc0cb59920d27dec"
Apr 16 16:47:43.692923 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689694 2568 flags.go:64] FLAG: --pod-manifest-path=""
Apr 16 16:47:43.692923 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689697 2568 flags.go:64] FLAG: --pod-max-pids="-1"
Apr 16 16:47:43.692923 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689701 2568 flags.go:64] FLAG: --pods-per-core="0"
Apr 16 16:47:43.693495 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689704 2568 flags.go:64] FLAG: --port="10250"
Apr 16 16:47:43.693495 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689707 2568 flags.go:64] FLAG: --protect-kernel-defaults="false"
Apr 16 16:47:43.693495 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689710 2568 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-034e2ce71c6acd65e"
Apr 16 16:47:43.693495 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689713 2568 flags.go:64] FLAG: --qos-reserved=""
Apr 16 16:47:43.693495 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689716 2568 flags.go:64] FLAG: --read-only-port="10255"
Apr 16 16:47:43.693495 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689719 2568 flags.go:64] FLAG: --register-node="true"
Apr 16 16:47:43.693495 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689723 2568 flags.go:64] FLAG: --register-schedulable="true"
Apr 16 16:47:43.693495 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689726 2568 flags.go:64] FLAG: --register-with-taints=""
Apr 16 16:47:43.693495 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689735 2568 flags.go:64] FLAG: --registry-burst="10"
Apr 16 16:47:43.693495 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689738 2568 flags.go:64] FLAG: --registry-qps="5"
Apr 16 16:47:43.693495 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689741 2568 flags.go:64] FLAG: --reserved-cpus=""
Apr 16 16:47:43.693495 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689744 2568 flags.go:64] FLAG: --reserved-memory=""
Apr 16 16:47:43.693495 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689748 2568 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Apr 16 16:47:43.693495 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689751 2568 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Apr 16 16:47:43.693495 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689754 2568 flags.go:64] FLAG: --rotate-certificates="false"
Apr 16 16:47:43.693495 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689757 2568 flags.go:64] FLAG: --rotate-server-certificates="false"
Apr 16 16:47:43.693495 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689760 2568 flags.go:64] FLAG: --runonce="false"
Apr 16 16:47:43.693495 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689763 2568 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Apr 16 16:47:43.693495 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689766 2568 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Apr 16 16:47:43.693495 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689769 2568 flags.go:64] FLAG: --seccomp-default="false"
Apr 16 16:47:43.693495 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689772 2568 flags.go:64] FLAG: --serialize-image-pulls="true"
Apr 16 16:47:43.693495 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689775 2568 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Apr 16 16:47:43.693495 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689778 2568 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Apr 16 16:47:43.693495 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689781 2568 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Apr 16 16:47:43.693495 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689785 2568 flags.go:64] FLAG: --storage-driver-password="root"
Apr 16 16:47:43.693495 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689787 2568 flags.go:64] FLAG: --storage-driver-secure="false"
Apr 16 16:47:43.694165 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689790 2568 flags.go:64] FLAG: --storage-driver-table="stats"
Apr 16 16:47:43.694165 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689793 2568 flags.go:64] FLAG: --storage-driver-user="root"
Apr 16 16:47:43.694165 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689797 2568 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Apr 16 16:47:43.694165 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689800 2568 flags.go:64] FLAG: --sync-frequency="1m0s"
Apr 16 16:47:43.694165 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689803 2568 flags.go:64] FLAG: --system-cgroups=""
Apr 16 16:47:43.694165 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689805 2568 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Apr 16 16:47:43.694165 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689811 2568 flags.go:64] FLAG: --system-reserved-cgroup=""
Apr 16 16:47:43.694165 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689814 2568 flags.go:64] FLAG: --tls-cert-file=""
Apr 16 16:47:43.694165 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689817 2568 flags.go:64] FLAG: --tls-cipher-suites="[]"
Apr 16 16:47:43.694165 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689821 2568 flags.go:64] FLAG: --tls-min-version=""
Apr 16 16:47:43.694165 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689824 2568 flags.go:64] FLAG: --tls-private-key-file=""
Apr 16 16:47:43.694165 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689827 2568 flags.go:64] FLAG: --topology-manager-policy="none"
Apr 16 16:47:43.694165 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689830 2568 flags.go:64] FLAG: --topology-manager-policy-options=""
Apr 16 16:47:43.694165 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689833 2568 flags.go:64] FLAG: --topology-manager-scope="container"
Apr 16 16:47:43.694165 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689836 2568 flags.go:64] FLAG: --v="2"
Apr 16 16:47:43.694165 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689841 2568 flags.go:64] FLAG: --version="false"
Apr 16 16:47:43.694165 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689845 2568 flags.go:64] FLAG: --vmodule=""
Apr 16 16:47:43.694165 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689849 2568 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Apr 16 16:47:43.694165 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.689853 2568 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Apr 16 16:47:43.694165 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.689954 2568 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 16:47:43.694165 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.689958 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 16:47:43.694165 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.689961 2568 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 16:47:43.694165 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.689964 2568 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 16:47:43.694165 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.689967 2568 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 16:47:43.694747 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.689970 2568 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 16:47:43.694747 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.689973 2568 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 16:47:43.694747 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.689976 2568 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 16:47:43.694747 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.689978 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 16:47:43.694747 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.689981 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 16:47:43.694747 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.689983 2568 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 16:47:43.694747 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.689986 2568 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 16:47:43.694747 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.689989 2568 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 16:47:43.694747 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.689991 2568 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 16:47:43.694747 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.689994 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 16:47:43.694747 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.689997 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 16:47:43.694747 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.690000 2568 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 16:47:43.694747 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.690003 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 16:47:43.694747 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.690005 2568 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 16:47:43.694747 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.690008 2568 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 16:47:43.694747 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.690011 2568 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 16:47:43.694747 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.690014 2568 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 16:47:43.694747 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.690022 2568 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 16:47:43.694747 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.690025 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 16:47:43.694747 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.690027 2568 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 16:47:43.695240 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.690030 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 16:47:43.695240 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.690033 2568 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 16:47:43.695240 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.690036 2568 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 16:47:43.695240 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.690039 2568 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 16:47:43.695240 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.690041 2568 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 16:47:43.695240 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.690044 2568 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 16:47:43.695240 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.690047 2568 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 16:47:43.695240 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.690050 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 16:47:43.695240 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.690052 2568 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 16:47:43.695240 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.690055 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 16:47:43.695240 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.690058 2568 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 16:47:43.695240 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.690060 2568 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 16:47:43.695240 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.690063 2568 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 16:47:43.695240 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.690065 2568 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 16:47:43.695240 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.690068 2568 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 16:47:43.695240 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.690071 2568 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 16:47:43.695240 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.690073 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 16:47:43.695240 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.690076 2568 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 16:47:43.695240 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.690079 2568 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 16:47:43.695240 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.690083 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 16:47:43.695747 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.690085 2568 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 16:47:43.695747 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.690088 2568 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 16:47:43.695747 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.690090 2568 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 16:47:43.695747 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.690093 2568 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 16:47:43.695747 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.690096 2568 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 16:47:43.695747 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.690099 2568 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 16:47:43.695747 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.690101 2568 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 16:47:43.695747 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.690104 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 16:47:43.695747 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.690106 2568 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 16:47:43.695747 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.690110 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 16:47:43.695747 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.690113 2568 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 16:47:43.695747 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.690116 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 16:47:43.695747 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.690118 2568 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 16:47:43.695747 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.690121 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 16:47:43.695747 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.690124 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 16:47:43.695747 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.690128 2568 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 16:47:43.695747 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.690131 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 16:47:43.695747 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.690134 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 16:47:43.695747 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.690137 2568 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 16:47:43.695747 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.690140 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 16:47:43.696235 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.690142 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 16:47:43.696235 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.690145 2568 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 16:47:43.696235 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.690148 2568 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 16:47:43.696235 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.690150 2568 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 16:47:43.696235 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.690153 2568 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 16:47:43.696235 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.690156 2568 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 16:47:43.696235 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.690158 2568 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 16:47:43.696235 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.690161 2568 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 16:47:43.696235 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.690163 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 16:47:43.696235 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.690166 2568 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 16:47:43.696235 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.690168 2568 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 16:47:43.696235 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.690173 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 16:47:43.696235 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.690175 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 16:47:43.696235 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.690178 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 16:47:43.696235 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.690182 2568 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 16:47:43.696235 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.690185 2568 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 16:47:43.696235 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.690188 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 16:47:43.696235 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.690191 2568 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 16:47:43.696235 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.690194 2568 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 16:47:43.696720 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.690197 2568 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 16:47:43.696720 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.690200 2568 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 16:47:43.696720 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.691055 2568 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 16:47:43.697792 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.697771 2568 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 16 16:47:43.697792 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.697792 2568 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 16 16:47:43.697856 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.697842 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 16:47:43.697856 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.697847 2568 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 16:47:43.697856 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.697851 2568 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 16:47:43.697856 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.697854 2568 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 16:47:43.697856 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.697857 2568 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 16:47:43.698015 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.697861 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 16:47:43.698015 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.697864 2568 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 16:47:43.698015 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.697867 2568 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 16:47:43.698015 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.697871 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 16:47:43.698015 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.697875 2568 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 16:47:43.698015 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.697878 2568 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 16:47:43.698015 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.697881 2568 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 16:47:43.698015 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.697885 2568 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 16:47:43.698015 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.697888 2568 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 16:47:43.698015 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.697890 2568 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 16:47:43.698015 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.697893 2568 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 16:47:43.698015 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.697898 2568 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 16:47:43.698015 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.697902 2568 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 16:47:43.698015 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.697905 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 16:47:43.698015 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.697908 2568 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 16:47:43.698015 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.697911 2568 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 16:47:43.698015 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.697914 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 16:47:43.698015 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.697916 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 16:47:43.698015 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.697919 2568 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 16:47:43.698473 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.697922 2568 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 16:47:43.698473 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.697925 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 16:47:43.698473 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.697928 2568 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 16:47:43.698473 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.697931 2568 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 16:47:43.698473 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.697934 2568 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 16:47:43.698473 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.697937 2568 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 16:47:43.698473 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.697939 2568 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 16:47:43.698473 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.697942 2568 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 16:47:43.698473 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.697944 2568 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 16:47:43.698473 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.697947 2568 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 16:47:43.698473 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.697950 2568 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 16:47:43.698473 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.697952 2568 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 16:47:43.698473 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.697956 2568 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 16:47:43.698473 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.697958 2568 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 16:47:43.698473 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.697962 2568 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 16:47:43.698473 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.697965 2568 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 16:47:43.698473 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.697968 2568 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 16:47:43.698473 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.697971 2568 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 16:47:43.698473 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.697973 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 16:47:43.698473 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.697976 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 16:47:43.698972 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.697979 2568 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 16:47:43.698972 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.697983 2568 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 16:47:43.698972 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.697986 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 16:47:43.698972 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.697989 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 16:47:43.698972 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.697992 2568 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 16:47:43.698972 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.697995 2568 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 16:47:43.698972 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.697997 2568 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 16:47:43.698972 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.698000 2568 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 16:47:43.698972 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.698003 2568 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 16:47:43.698972 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.698005 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 16:47:43.698972 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.698008 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 16:47:43.698972 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.698010 2568 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 16:47:43.698972 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.698013 2568 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 16:47:43.698972 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.698016 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 16:47:43.698972 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.698019 2568 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 16:47:43.698972 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.698022 2568 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 16:47:43.698972 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.698025 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 16:47:43.698972 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.698028 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 16:47:43.698972 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.698031 2568 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 16:47:43.698972 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.698033 2568 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 16:47:43.699451 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.698036 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 16:47:43.699451 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.698038 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 16:47:43.699451 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.698041 2568 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 16:47:43.699451 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.698044 2568 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 16:47:43.699451 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.698046 2568 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 16:47:43.699451 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.698049 2568 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 16:47:43.699451 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.698052 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 16:47:43.699451 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.698055 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 16:47:43.699451 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.698057 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 16:47:43.699451 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.698060 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 16:47:43.699451 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.698062 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 16:47:43.699451 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.698065 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 16:47:43.699451 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.698068 2568 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 16:47:43.699451 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.698071 2568 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 16:47:43.699451 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.698073 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 16:47:43.699451 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.698076 2568 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 16:47:43.699451 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.698079 2568 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 16:47:43.699451 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.698081 2568 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 16:47:43.699451 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.698084 2568 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 16:47:43.699932 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.698087 2568 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 16:47:43.699932 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.698090 2568 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 16:47:43.699932 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.698092 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 16:47:43.699932 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.698098 2568 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 16:47:43.699932 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.698216 2568 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 16:47:43.699932 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.698222 2568 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 16:47:43.699932 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.698224 2568 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 16:47:43.699932 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.698228 2568 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 16:47:43.699932 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.698231 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 16:47:43.699932 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.698235 2568 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 16:47:43.699932 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.698238 2568 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 16:47:43.699932 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.698241 2568 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 16:47:43.699932 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.698244 2568 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 16:47:43.699932 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.698246 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 16:47:43.699932 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.698249 2568 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 16:47:43.700297 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.698252 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 16:47:43.700297 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.698255 2568 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 16:47:43.700297 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.698258 2568 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 16:47:43.700297 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.698261 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 16:47:43.700297 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.698264 2568 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 16:47:43.700297 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.698267 2568 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 16:47:43.700297 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.698269 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 16:47:43.700297 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.698272 2568 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 16:47:43.700297 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.698275 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 16:47:43.700297 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.698277 2568 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 16:47:43.700297 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.698280 2568 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 16:47:43.700297 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.698283 2568 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 16:47:43.700297 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.698286 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 16:47:43.700297 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.698288 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 16:47:43.700297 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.698291 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 16:47:43.700297 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.698293 2568 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 16:47:43.700297 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.698296 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 16:47:43.700297 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.698299 2568 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 16:47:43.700297 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.698301 2568 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 16:47:43.700297 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.698304 2568 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 16:47:43.700795 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.698307 2568 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 16:47:43.700795 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.698310 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 16:47:43.700795 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.698313 2568 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 16:47:43.700795 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.698315 2568 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 16:47:43.700795 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.698318 2568 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 16:47:43.700795 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.698321 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 16:47:43.700795 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.698324 2568 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 16:47:43.700795 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.698327 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 16:47:43.700795 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.698329 2568 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 16:47:43.700795 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.698332 2568 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 16:47:43.700795 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.698335 2568 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 16:47:43.700795 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.698337 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 16:47:43.700795 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.698340 2568 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 16:47:43.700795 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.698342 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 16:47:43.700795 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.698345 2568 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 16:47:43.700795 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.698347 2568 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 16:47:43.700795 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.698350 2568 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 16:47:43.700795 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.698353 2568 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 16:47:43.700795 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.698357 2568 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 16:47:43.700795 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.698360 2568 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 16:47:43.701353 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.698363 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 16:47:43.701353 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.698366 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 16:47:43.701353 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.698369 2568 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 16:47:43.701353 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.698372 2568 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 16:47:43.701353 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.698374 2568 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 16:47:43.701353 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.698377 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 16:47:43.701353 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.698379 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 16:47:43.701353 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.698382 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 16:47:43.701353 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.698385 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 16:47:43.701353 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.698387 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 16:47:43.701353 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.698390 2568 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 16:47:43.701353 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.698392 2568 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 16:47:43.701353 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.698395 2568 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 16:47:43.701353 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.698398 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 16:47:43.701353 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.698401 2568 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 16:47:43.701353 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.698403 2568 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 16:47:43.701353 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.698406 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 16:47:43.701353 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.698408 2568 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 16:47:43.701353 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.698411 2568 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 16:47:43.701353 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.698414 2568 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 16:47:43.701948 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.698417 2568 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 16:47:43.701948 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.698420 2568 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 16:47:43.701948 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.698422 2568 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 16:47:43.701948 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.698425 2568 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 16:47:43.701948 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.698427 2568 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 16:47:43.701948 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.698430 2568 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 16:47:43.701948 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.698432 2568 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 16:47:43.701948 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.698436 2568 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 16:47:43.701948 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.698439 2568 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 16:47:43.701948 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.698442 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 16:47:43.701948 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.698445 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 16:47:43.701948 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.698447 2568 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 16:47:43.701948 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.698450 2568 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 16:47:43.701948 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.698452 2568 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 16:47:43.701948 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:43.698455 2568 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 16:47:43.702313 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.698460 2568 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 16:47:43.702313 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.699307 2568 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 16 16:47:43.702313 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.701494 2568 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 16 16:47:43.702670 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.702658 2568 server.go:1019] "Starting client certificate rotation"
Apr 16 16:47:43.702769 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.702754 2568 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 16:47:43.702803 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.702792 2568 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 16:47:43.728748 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.728725 2568 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 16:47:43.731886 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.731852 2568 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 16:47:43.751247 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.751219 2568 log.go:25] "Validated CRI v1 runtime API"
Apr 16 16:47:43.756900 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.756872 2568 log.go:25] "Validated CRI v1 image API"
Apr 16 16:47:43.758133 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.758117 2568 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 16 16:47:43.758980 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.758964 2568 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 16:47:43.762409 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.762386 2568 fs.go:135] Filesystem UUIDs: map[676ec864-f336-4d5e-bac6-f4444c0bcbbb:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 9d1ce0e1-ff86-4fe3-af78-0b23b8690eb4:/dev/nvme0n1p3]
Apr 16 16:47:43.762499 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.762407 2568 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 16 16:47:43.768358 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.768241 2568 manager.go:217] Machine: {Timestamp:2026-04-16 16:47:43.766196949 +0000 UTC m=+0.436373731 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3099433 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2390b95f7b0d9a7826f25fd7985a58 SystemUUID:ec2390b9-5f7b-0d9a-7826-f25fd7985a58 BootID:6cf296c3-0579-4625-8396-8ff158c7fcc5 Filesystems:[{Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6098944 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:cc:25:c8:9d:b5 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:cc:25:c8:9d:b5 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:a2:50:a0:75:70:cb Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 16 16:47:43.768358 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.768344 2568 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
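Editor's note: all three feature-gate parsing passes above resolve to the same map at feature_gate.go:384, which suggests the "unrecognized feature gate" warnings are OpenShift-level gate names that this kubelet binary simply does not know, rather than configuration errors; the resolved map is the line worth extracting when auditing such a dump. Below is a minimal stdlib-Go sketch that pulls it out of journal text on stdin. The regex and map layout are inferred from the log lines above, and parse_gates.go is a hypothetical helper, not part of the kubelet:

```go
// parse_gates.go - extract the kubelet's resolved feature-gate map from a
// journal dump on stdin. A sketch based on the log layout above.
package main

import (
	"bufio"
	"fmt"
	"os"
	"regexp"
	"strings"
)

// Matches the payload of the feature_gate.go:384 lines, e.g.
//   feature gates: {map[ImageVolume:true KMSv1:true ...]}
var gateMapRe = regexp.MustCompile(`feature gates: \{map\[([^\]]*)\]\}`)

func main() {
	sc := bufio.NewScanner(os.Stdin)
	sc.Buffer(make([]byte, 0, 1024*1024), 1024*1024) // journal lines can be long
	for sc.Scan() {
		m := gateMapRe.FindStringSubmatch(sc.Text())
		if m == nil {
			continue
		}
		// Split "Name:bool" pairs into a lookup table.
		gates := map[string]bool{}
		for _, kv := range strings.Fields(m[1]) {
			if k, v, ok := strings.Cut(kv, ":"); ok {
				gates[k] = v == "true"
			}
		}
		fmt.Printf("%d gates resolved, KMSv1=%v\n", len(gates), gates["KMSv1"])
	}
}
```

Something like `journalctl -u kubelet -o cat | go run parse_gates.go` would feed it, assuming the kubelet runs under the systemd unit shown in this log; each of the three passes then prints the identical resolved set.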
Apr 16 16:47:43.768549 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.768514 2568 manager.go:233] Version: {KernelVersion:5.14.0-570.104.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260401-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 16 16:47:43.769597 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.769572 2568 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 16 16:47:43.769791 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.769599 2568 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-131-63.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 16 16:47:43.769869 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.769806 2568 topology_manager.go:138] "Creating topology manager with none policy"
Apr 16 16:47:43.769869 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.769818 2568 container_manager_linux.go:306] "Creating device plugin manager"
Apr 16 16:47:43.769869 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.769837 2568 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 16:47:43.771050 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.771038 2568 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 16:47:43.772557 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.772545 2568 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 16:47:43.772870 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.772858 2568 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 16 16:47:43.775939 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.775928 2568 kubelet.go:491] "Attempting to sync node with API server"
Apr 16 16:47:43.776021 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.775962 2568 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 16 16:47:43.776021 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.775984 2568 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 16 16:47:43.776021 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.775998 2568 kubelet.go:397] "Adding apiserver pod source"
Apr 16 16:47:43.776138 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.776031 2568 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 16 16:47:43.777393 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.777379 2568 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 16 16:47:43.777475 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.777402 2568 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 16 16:47:43.780857 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.780843 2568 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 16 16:47:43.782092 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.782066 2568 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-s24zg"
Apr 16 16:47:43.782754 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.782740 2568 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 16 16:47:43.785176 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.785164 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 16 16:47:43.785248 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.785181 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 16 16:47:43.785248 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.785187 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 16 16:47:43.785248 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.785193 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 16 16:47:43.785248 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.785200 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 16 16:47:43.785248 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.785209 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 16 16:47:43.785248 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.785217 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 16 16:47:43.785248 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.785222 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 16 16:47:43.785248 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.785231 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 16 16:47:43.785248 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.785237 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 16 16:47:43.785248 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.785255 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 16 16:47:43.785598 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.785265 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 16 16:47:43.786250 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:47:43.786227 2568 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-131-63.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 16 16:47:43.786339 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.786283 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 16 16:47:43.786339 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.786297 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 16 16:47:43.786339 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:47:43.786321 2568 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 16 16:47:43.789139 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.789123 2568 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-s24zg"
Apr 16 16:47:43.790666 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.790651 2568 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 16 16:47:43.790736 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.790690 2568 server.go:1295] "Started kubelet"
Apr 16 16:47:43.790800 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.790771 2568 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 16 16:47:43.791181 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.791001 2568 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 16 16:47:43.791306 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.791293 2568 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 16 16:47:43.791660 ip-10-0-131-63 systemd[1]: Started Kubernetes Kubelet.
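Editor's note: the nodeConfig payload logged by container_manager_linux.go:275 above is plain JSON, so the SystemReserved and HardEvictionThresholds values in effect on this node can be decoded straight from the journal. A small sketch follows; the field names are copied from the log line, while the struct definitions themselves are hypothetical models of just those fields, not the kubelet's own types:

```go
// node_config.go - decode SystemReserved and HardEvictionThresholds from the
// nodeConfig JSON logged above. Structs model only the fields shown there.
package main

import (
	"encoding/json"
	"fmt"
)

type threshold struct {
	Signal   string
	Operator string
	Value    struct {
		Quantity   *string // e.g. "100Mi"; null when a percentage is used
		Percentage float64 // e.g. 0.1 for 10%
	}
}

type nodeConfig struct {
	NodeName               string
	SystemReserved         map[string]string
	HardEvictionThresholds []threshold
}

func main() {
	// Trimmed copy of the logged nodeConfig={...} payload.
	raw := `{"NodeName":"ip-10-0-131-63.ec2.internal",
	  "SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},
	  "HardEvictionThresholds":[
	    {"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0}},
	    {"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1}}]}`

	var cfg nodeConfig
	if err := json.Unmarshal([]byte(raw), &cfg); err != nil {
		panic(err)
	}
	fmt.Println("system reserved:", cfg.SystemReserved)
	for _, t := range cfg.HardEvictionThresholds {
		if t.Value.Quantity != nil {
			fmt.Printf("%s %s %s\n", t.Signal, t.Operator, *t.Value.Quantity)
		} else {
			fmt.Printf("%s %s %.0f%%\n", t.Signal, t.Operator, t.Value.Percentage*100)
		}
	}
}
```

Run as-is it prints the memory and nodefs hard-eviction thresholds shown in the log; pointing raw at the full logged payload instead of the trimmed copy would recover all five thresholds.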
Apr 16 16:47:43.792604 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.792564 2568 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 16 16:47:43.797172 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.797142 2568 server.go:317] "Adding debug handlers to kubelet server" Apr 16 16:47:43.800799 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.800737 2568 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 16 16:47:43.800799 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.800754 2568 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 16 16:47:43.801699 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:47:43.801676 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-63.ec2.internal\" not found" Apr 16 16:47:43.802002 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.801983 2568 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 16 16:47:43.802002 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.802003 2568 factory.go:55] Registering systemd factory Apr 16 16:47:43.802124 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.802013 2568 factory.go:223] Registration of the systemd container factory successfully Apr 16 16:47:43.802124 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.802064 2568 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 16 16:47:43.802124 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.802068 2568 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 16 16:47:43.802124 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.802094 2568 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 16 16:47:43.802308 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.802269 2568 reconstruct.go:97] "Volume reconstruction finished" Apr 16 16:47:43.802308 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.802278 2568 reconciler.go:26] "Reconciler: start to sync state" Apr 16 16:47:43.802394 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.802333 2568 factory.go:153] Registering CRI-O factory Apr 16 16:47:43.802394 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.802349 2568 factory.go:223] Registration of the crio container factory successfully Apr 16 16:47:43.802394 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.802373 2568 factory.go:103] Registering Raw factory Apr 16 16:47:43.802394 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.802386 2568 manager.go:1196] Started watching for new ooms in manager Apr 16 16:47:43.802758 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.802736 2568 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 16:47:43.802874 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.802862 2568 manager.go:319] Starting recovery of all containers Apr 16 16:47:43.803850 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:47:43.803808 2568 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 16 16:47:43.807679 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:47:43.807447 2568 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-131-63.ec2.internal\" not found" node="ip-10-0-131-63.ec2.internal" Apr 16 16:47:43.807679 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.807487 2568 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-131-63.ec2.internal" not found Apr 16 16:47:43.813344 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.813327 2568 manager.go:324] Recovery completed Apr 16 16:47:43.817687 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.817674 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 16:47:43.820360 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.820344 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-63.ec2.internal" event="NodeHasSufficientMemory" Apr 16 16:47:43.820439 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.820372 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-63.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 16:47:43.820439 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.820383 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-63.ec2.internal" event="NodeHasSufficientPID" Apr 16 16:47:43.820906 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.820893 2568 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 16 16:47:43.820906 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.820903 2568 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 16 16:47:43.821003 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.820918 2568 state_mem.go:36] "Initialized new in-memory state store" Apr 16 16:47:43.823430 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.823418 2568 policy_none.go:49] "None policy: Start" Apr 16 16:47:43.823471 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.823434 2568 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 16 16:47:43.823471 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.823444 2568 state_mem.go:35] "Initializing new in-memory state store" Apr 16 16:47:43.824454 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.824424 2568 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-131-63.ec2.internal" not found Apr 16 16:47:43.860103 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.860085 2568 manager.go:341] "Starting Device Plugin manager" Apr 16 16:47:43.871020 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:47:43.860176 2568 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 16 16:47:43.871020 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.860191 2568 server.go:85] "Starting device plugin registration server" Apr 16 16:47:43.871020 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.860449 2568 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 16 16:47:43.871020 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.860460 2568 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 16 16:47:43.871020 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.860589 2568 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 16 
16:47:43.871020 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.860664 2568 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 16 16:47:43.871020 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.860674 2568 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 16 16:47:43.871020 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:47:43.861176 2568 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 16 16:47:43.871020 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:47:43.861207 2568 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-131-63.ec2.internal\" not found" Apr 16 16:47:43.882491 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.882472 2568 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-131-63.ec2.internal" not found Apr 16 16:47:43.927807 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.927723 2568 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 16 16:47:43.928879 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.928860 2568 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 16 16:47:43.928938 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.928897 2568 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 16 16:47:43.928938 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.928921 2568 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 16 16:47:43.928938 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.928930 2568 kubelet.go:2451] "Starting kubelet main sync loop" Apr 16 16:47:43.929040 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:47:43.928974 2568 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 16 16:47:43.932760 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.932743 2568 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 16:47:43.961433 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.961399 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 16:47:43.962549 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.962532 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-63.ec2.internal" event="NodeHasSufficientMemory" Apr 16 16:47:43.962620 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.962563 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-63.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 16:47:43.962620 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.962579 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-63.ec2.internal" event="NodeHasSufficientPID" Apr 16 16:47:43.962620 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.962602 2568 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-131-63.ec2.internal" Apr 16 16:47:43.971170 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:43.971154 2568 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-131-63.ec2.internal" Apr 16 16:47:43.971259 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:47:43.971174 2568 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node 
\"ip-10-0-131-63.ec2.internal\": node \"ip-10-0-131-63.ec2.internal\" not found" Apr 16 16:47:43.985590 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:47:43.985572 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-63.ec2.internal\" not found" Apr 16 16:47:44.029579 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:44.029548 2568 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-63.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-131-63.ec2.internal"] Apr 16 16:47:44.029672 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:44.029628 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 16:47:44.031110 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:44.031095 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-63.ec2.internal" event="NodeHasSufficientMemory" Apr 16 16:47:44.031159 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:44.031129 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-63.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 16:47:44.031159 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:44.031141 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-63.ec2.internal" event="NodeHasSufficientPID" Apr 16 16:47:44.032378 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:44.032366 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 16:47:44.032511 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:44.032497 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-63.ec2.internal" Apr 16 16:47:44.032573 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:44.032540 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 16:47:44.033051 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:44.033032 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-63.ec2.internal" event="NodeHasSufficientMemory" Apr 16 16:47:44.033129 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:44.033038 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-63.ec2.internal" event="NodeHasSufficientMemory" Apr 16 16:47:44.033129 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:44.033088 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-63.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 16:47:44.033129 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:44.033099 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-63.ec2.internal" event="NodeHasSufficientPID" Apr 16 16:47:44.033129 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:44.033067 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-63.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 16:47:44.033263 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:44.033139 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-63.ec2.internal" event="NodeHasSufficientPID" Apr 16 16:47:44.035070 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:44.035054 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-63.ec2.internal" Apr 16 16:47:44.035151 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:44.035080 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 16:47:44.035682 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:44.035668 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-63.ec2.internal" event="NodeHasSufficientMemory" Apr 16 16:47:44.035762 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:44.035696 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-63.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 16:47:44.035762 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:44.035707 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-63.ec2.internal" event="NodeHasSufficientPID" Apr 16 16:47:44.058226 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:47:44.058203 2568 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-131-63.ec2.internal\" not found" node="ip-10-0-131-63.ec2.internal" Apr 16 16:47:44.062424 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:47:44.062405 2568 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-131-63.ec2.internal\" not found" node="ip-10-0-131-63.ec2.internal" Apr 16 16:47:44.085838 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:47:44.085817 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-63.ec2.internal\" not found" Apr 16 16:47:44.103866 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:44.103843 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e3c2b6dbe3e0a746e2afaa0da10e7815-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-131-63.ec2.internal\" (UID: \"e3c2b6dbe3e0a746e2afaa0da10e7815\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-63.ec2.internal" Apr 16 16:47:44.103948 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:44.103875 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e3c2b6dbe3e0a746e2afaa0da10e7815-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-131-63.ec2.internal\" (UID: \"e3c2b6dbe3e0a746e2afaa0da10e7815\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-63.ec2.internal" Apr 16 16:47:44.103948 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:44.103897 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/efe8b482663d36b8cd0f3c122fed91e1-config\") pod \"kube-apiserver-proxy-ip-10-0-131-63.ec2.internal\" (UID: \"efe8b482663d36b8cd0f3c122fed91e1\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-131-63.ec2.internal" Apr 16 16:47:44.186172 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:47:44.186102 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-63.ec2.internal\" not found" Apr 16 16:47:44.204529 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:44.204493 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e3c2b6dbe3e0a746e2afaa0da10e7815-etc-kube\") pod 
\"kube-rbac-proxy-crio-ip-10-0-131-63.ec2.internal\" (UID: \"e3c2b6dbe3e0a746e2afaa0da10e7815\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-63.ec2.internal" Apr 16 16:47:44.204589 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:44.204552 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e3c2b6dbe3e0a746e2afaa0da10e7815-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-131-63.ec2.internal\" (UID: \"e3c2b6dbe3e0a746e2afaa0da10e7815\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-63.ec2.internal" Apr 16 16:47:44.204589 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:44.204565 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e3c2b6dbe3e0a746e2afaa0da10e7815-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-131-63.ec2.internal\" (UID: \"e3c2b6dbe3e0a746e2afaa0da10e7815\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-63.ec2.internal" Apr 16 16:47:44.204648 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:44.204572 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/efe8b482663d36b8cd0f3c122fed91e1-config\") pod \"kube-apiserver-proxy-ip-10-0-131-63.ec2.internal\" (UID: \"efe8b482663d36b8cd0f3c122fed91e1\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-131-63.ec2.internal" Apr 16 16:47:44.204648 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:44.204593 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/efe8b482663d36b8cd0f3c122fed91e1-config\") pod \"kube-apiserver-proxy-ip-10-0-131-63.ec2.internal\" (UID: \"efe8b482663d36b8cd0f3c122fed91e1\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-131-63.ec2.internal" Apr 16 16:47:44.204754 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:44.204646 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e3c2b6dbe3e0a746e2afaa0da10e7815-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-131-63.ec2.internal\" (UID: \"e3c2b6dbe3e0a746e2afaa0da10e7815\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-63.ec2.internal" Apr 16 16:47:44.286992 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:47:44.286950 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-63.ec2.internal\" not found" Apr 16 16:47:44.362543 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:44.362506 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-63.ec2.internal" Apr 16 16:47:44.365195 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:44.365180 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-63.ec2.internal" Apr 16 16:47:44.387583 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:47:44.387556 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-63.ec2.internal\" not found" Apr 16 16:47:44.488219 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:47:44.488147 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-63.ec2.internal\" not found" Apr 16 16:47:44.588731 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:47:44.588690 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-63.ec2.internal\" not found" Apr 16 16:47:44.689247 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:47:44.689226 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-63.ec2.internal\" not found" Apr 16 16:47:44.702477 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:44.702457 2568 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 16 16:47:44.702609 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:44.702591 2568 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 16 16:47:44.702673 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:44.702601 2568 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 16 16:47:44.790230 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:47:44.790146 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-63.ec2.internal\" not found" Apr 16 16:47:44.791263 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:44.791233 2568 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 16:42:43 +0000 UTC" deadline="2027-12-06 16:20:24.04713061 +0000 UTC" Apr 16 16:47:44.791314 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:44.791263 2568 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14375h32m39.255870401s" Apr 16 16:47:44.801404 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:44.801384 2568 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 16 16:47:44.825054 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:44.825030 2568 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 16 16:47:44.844322 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:44.844292 2568 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-t9r9b" Apr 16 16:47:44.852180 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:44.852155 2568 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-t9r9b" Apr 16 16:47:44.866352 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:44.866333 2568 
Apr 16 16:47:44.890473 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:47:44.890446 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-63.ec2.internal\" not found"
Apr 16 16:47:44.962638 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:44.962584 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode3c2b6dbe3e0a746e2afaa0da10e7815.slice/crio-5b765e65f9ad63b34b0f513d8f3a8ddb213f75be4a3240fb648bfeb782e1ab89 WatchSource:0}: Error finding container 5b765e65f9ad63b34b0f513d8f3a8ddb213f75be4a3240fb648bfeb782e1ab89: Status 404 returned error can't find the container with id 5b765e65f9ad63b34b0f513d8f3a8ddb213f75be4a3240fb648bfeb782e1ab89
Apr 16 16:47:44.962901 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:44.962876 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podefe8b482663d36b8cd0f3c122fed91e1.slice/crio-5dd34d5b9df5d551012c252abf6bf81ebec770e91af8f3035a72ea3ead2e2913 WatchSource:0}: Error finding container 5dd34d5b9df5d551012c252abf6bf81ebec770e91af8f3035a72ea3ead2e2913: Status 404 returned error can't find the container with id 5dd34d5b9df5d551012c252abf6bf81ebec770e91af8f3035a72ea3ead2e2913
Apr 16 16:47:44.967136 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:44.967121 2568 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 16:47:44.991162 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:47:44.991124 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-63.ec2.internal\" not found"
Apr 16 16:47:45.091792 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:47:45.091711 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-63.ec2.internal\" not found"
Apr 16 16:47:45.192271 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:47:45.192236 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-63.ec2.internal\" not found"
Apr 16 16:47:45.293148 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:47:45.293116 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-63.ec2.internal\" not found"
Apr 16 16:47:45.377641 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.377575 2568 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 16:47:45.401398 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.401365 2568 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-63.ec2.internal"
Apr 16 16:47:45.416392 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.416359 2568 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 16:47:45.417588 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.417567 2568 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-63.ec2.internal"
Apr 16 16:47:45.427508 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.427480 2568 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 16:47:45.627470 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.627439 2568 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 16:47:45.708488 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.708404 2568 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 16:47:45.777954 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.777922 2568 apiserver.go:52] "Watching apiserver"
Apr 16 16:47:45.786131 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.786094 2568 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 16 16:47:45.788263 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.788239 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-nwmhb","openshift-network-operator/iptables-alerter-j6gfs","openshift-ovn-kubernetes/ovnkube-node-s2x9b","kube-system/kube-apiserver-proxy-ip-10-0-131-63.ec2.internal","openshift-dns/node-resolver-ptmqw","openshift-multus/multus-additional-cni-plugins-brchx","openshift-multus/network-metrics-daemon-5rwjz","openshift-network-diagnostics/network-check-target-ts7hl","kube-system/konnectivity-agent-d4xtv","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xzx59","openshift-cluster-node-tuning-operator/tuned-7p8g2","openshift-image-registry/node-ca-j8nrt","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-63.ec2.internal"]
Apr 16 16:47:45.791428 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.791407 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-ptmqw"
Apr 16 16:47:45.793141 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.792883 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-j8nrt"
Apr 16 16:47:45.793141 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.793036 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-s2x9b"
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-s2x9b" Apr 16 16:47:45.794164 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.794134 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 16 16:47:45.794421 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.794402 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 16 16:47:45.795555 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.795298 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 16 16:47:45.795869 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.795685 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 16 16:47:45.795869 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.795713 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-fjtgd\"" Apr 16 16:47:45.796068 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.795991 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-l4c4x\"" Apr 16 16:47:45.796147 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.796116 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 16 16:47:45.797140 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.796228 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 16 16:47:45.797140 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.796870 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 16 16:47:45.797140 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.796914 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 16 16:47:45.797389 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.797258 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-6nfnn\"" Apr 16 16:47:45.797443 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.797391 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 16 16:47:45.797674 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.797531 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 16 16:47:45.797959 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.797858 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xzx59" Apr 16 16:47:45.799311 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.799294 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 16 16:47:45.800208 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.800184 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-brchx" Apr 16 16:47:45.800208 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.800198 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5rwjz" Apr 16 16:47:45.800501 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:47:45.800474 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5rwjz" podUID="65f280f9-caf6-429e-ac03-31bd647a05b6" Apr 16 16:47:45.801274 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.801255 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 16 16:47:45.801664 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.801638 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 16 16:47:45.801892 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.801876 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-4shpx\"" Apr 16 16:47:45.801982 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.801965 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 16 16:47:45.802277 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.802257 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ts7hl" Apr 16 16:47:45.802350 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:47:45.802315 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-ts7hl" podUID="018fd32e-3479-4227-9d81-8a232b27fc2b" Apr 16 16:47:45.802626 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.802597 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 16 16:47:45.802757 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.802740 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 16 16:47:45.803063 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.803043 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 16 16:47:45.803132 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.803107 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 16 16:47:45.803202 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.803178 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-f8b9c\"" Apr 16 16:47:45.803283 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.803259 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 16 16:47:45.804058 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.804039 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-d4xtv" Apr 16 16:47:45.805641 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.805620 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-nwmhb" Apr 16 16:47:45.806346 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.806328 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-88mzj\"" Apr 16 16:47:45.806435 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.806358 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 16 16:47:45.806684 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.806670 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 16 16:47:45.808097 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.808081 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 16 16:47:45.808186 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.808152 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-657sj\"" Apr 16 16:47:45.808888 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.808641 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-7p8g2" Apr 16 16:47:45.808888 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.808824 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-j6gfs" Apr 16 16:47:45.811041 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.811023 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 16 16:47:45.811135 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.811086 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-b4xpn\"" Apr 16 16:47:45.811300 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.811285 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 16 16:47:45.811354 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.811300 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 16 16:47:45.811682 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.811661 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-jx86s\"" Apr 16 16:47:45.811758 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.811683 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 16 16:47:45.811824 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.811779 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 16 16:47:45.814851 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.814830 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/65f280f9-caf6-429e-ac03-31bd647a05b6-metrics-certs\") pod \"network-metrics-daemon-5rwjz\" (UID: \"65f280f9-caf6-429e-ac03-31bd647a05b6\") " pod="openshift-multus/network-metrics-daemon-5rwjz" Apr 16 16:47:45.814943 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.814865 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/40202585-b938-4a5c-bde8-ac1c5ea40044-ovnkube-config\") pod \"ovnkube-node-s2x9b\" (UID: \"40202585-b938-4a5c-bde8-ac1c5ea40044\") " pod="openshift-ovn-kubernetes/ovnkube-node-s2x9b" Apr 16 16:47:45.814943 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.814891 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/40202585-b938-4a5c-bde8-ac1c5ea40044-env-overrides\") pod \"ovnkube-node-s2x9b\" (UID: \"40202585-b938-4a5c-bde8-ac1c5ea40044\") " pod="openshift-ovn-kubernetes/ovnkube-node-s2x9b" Apr 16 16:47:45.814943 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.814913 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/40202585-b938-4a5c-bde8-ac1c5ea40044-host-run-netns\") pod \"ovnkube-node-s2x9b\" (UID: \"40202585-b938-4a5c-bde8-ac1c5ea40044\") " pod="openshift-ovn-kubernetes/ovnkube-node-s2x9b" Apr 16 16:47:45.814943 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.814940 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" 
(UniqueName: \"kubernetes.io/host-path/0c447c29-c470-4314-becc-ad24580321c8-hosts-file\") pod \"node-resolver-ptmqw\" (UID: \"0c447c29-c470-4314-becc-ad24580321c8\") " pod="openshift-dns/node-resolver-ptmqw" Apr 16 16:47:45.815141 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.814963 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/0c447c29-c470-4314-becc-ad24580321c8-tmp-dir\") pod \"node-resolver-ptmqw\" (UID: \"0c447c29-c470-4314-becc-ad24580321c8\") " pod="openshift-dns/node-resolver-ptmqw" Apr 16 16:47:45.815141 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.815006 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/40202585-b938-4a5c-bde8-ac1c5ea40044-var-lib-openvswitch\") pod \"ovnkube-node-s2x9b\" (UID: \"40202585-b938-4a5c-bde8-ac1c5ea40044\") " pod="openshift-ovn-kubernetes/ovnkube-node-s2x9b" Apr 16 16:47:45.815141 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.815042 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/40202585-b938-4a5c-bde8-ac1c5ea40044-host-run-ovn-kubernetes\") pod \"ovnkube-node-s2x9b\" (UID: \"40202585-b938-4a5c-bde8-ac1c5ea40044\") " pod="openshift-ovn-kubernetes/ovnkube-node-s2x9b" Apr 16 16:47:45.815141 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.815066 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8gz4z\" (UniqueName: \"kubernetes.io/projected/0c447c29-c470-4314-becc-ad24580321c8-kube-api-access-8gz4z\") pod \"node-resolver-ptmqw\" (UID: \"0c447c29-c470-4314-becc-ad24580321c8\") " pod="openshift-dns/node-resolver-ptmqw" Apr 16 16:47:45.815141 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.815090 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/40202585-b938-4a5c-bde8-ac1c5ea40044-etc-openvswitch\") pod \"ovnkube-node-s2x9b\" (UID: \"40202585-b938-4a5c-bde8-ac1c5ea40044\") " pod="openshift-ovn-kubernetes/ovnkube-node-s2x9b" Apr 16 16:47:45.815141 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.815113 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/6ee9ccfb-aa99-4334-93d4-21dad3568cda-socket-dir\") pod \"aws-ebs-csi-driver-node-xzx59\" (UID: \"6ee9ccfb-aa99-4334-93d4-21dad3568cda\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xzx59" Apr 16 16:47:45.815422 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.815164 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/6ee9ccfb-aa99-4334-93d4-21dad3568cda-registration-dir\") pod \"aws-ebs-csi-driver-node-xzx59\" (UID: \"6ee9ccfb-aa99-4334-93d4-21dad3568cda\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xzx59" Apr 16 16:47:45.815422 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.815188 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/6ee9ccfb-aa99-4334-93d4-21dad3568cda-device-dir\") pod \"aws-ebs-csi-driver-node-xzx59\" (UID: 
\"6ee9ccfb-aa99-4334-93d4-21dad3568cda\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xzx59" Apr 16 16:47:45.815422 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.815223 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rmzvh\" (UniqueName: \"kubernetes.io/projected/6ee9ccfb-aa99-4334-93d4-21dad3568cda-kube-api-access-rmzvh\") pod \"aws-ebs-csi-driver-node-xzx59\" (UID: \"6ee9ccfb-aa99-4334-93d4-21dad3568cda\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xzx59" Apr 16 16:47:45.815422 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.815252 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/482c17e3-998c-48aa-b158-037aa6ebf920-tuning-conf-dir\") pod \"multus-additional-cni-plugins-brchx\" (UID: \"482c17e3-998c-48aa-b158-037aa6ebf920\") " pod="openshift-multus/multus-additional-cni-plugins-brchx" Apr 16 16:47:45.815422 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.815278 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/482c17e3-998c-48aa-b158-037aa6ebf920-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-brchx\" (UID: \"482c17e3-998c-48aa-b158-037aa6ebf920\") " pod="openshift-multus/multus-additional-cni-plugins-brchx" Apr 16 16:47:45.815422 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.815301 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/40202585-b938-4a5c-bde8-ac1c5ea40044-ovn-node-metrics-cert\") pod \"ovnkube-node-s2x9b\" (UID: \"40202585-b938-4a5c-bde8-ac1c5ea40044\") " pod="openshift-ovn-kubernetes/ovnkube-node-s2x9b" Apr 16 16:47:45.815422 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.815326 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6ee9ccfb-aa99-4334-93d4-21dad3568cda-kubelet-dir\") pod \"aws-ebs-csi-driver-node-xzx59\" (UID: \"6ee9ccfb-aa99-4334-93d4-21dad3568cda\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xzx59" Apr 16 16:47:45.815422 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.815348 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/482c17e3-998c-48aa-b158-037aa6ebf920-cnibin\") pod \"multus-additional-cni-plugins-brchx\" (UID: \"482c17e3-998c-48aa-b158-037aa6ebf920\") " pod="openshift-multus/multus-additional-cni-plugins-brchx" Apr 16 16:47:45.815422 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.815378 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/c39d9cc0-8cac-46bd-968d-baba878cd954-serviceca\") pod \"node-ca-j8nrt\" (UID: \"c39d9cc0-8cac-46bd-968d-baba878cd954\") " pod="openshift-image-registry/node-ca-j8nrt" Apr 16 16:47:45.815422 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.815421 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/40202585-b938-4a5c-bde8-ac1c5ea40044-run-systemd\") pod \"ovnkube-node-s2x9b\" (UID: \"40202585-b938-4a5c-bde8-ac1c5ea40044\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-s2x9b" Apr 16 16:47:45.815900 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.815445 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/6ee9ccfb-aa99-4334-93d4-21dad3568cda-etc-selinux\") pod \"aws-ebs-csi-driver-node-xzx59\" (UID: \"6ee9ccfb-aa99-4334-93d4-21dad3568cda\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xzx59" Apr 16 16:47:45.815900 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.815470 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8vpv\" (UniqueName: \"kubernetes.io/projected/482c17e3-998c-48aa-b158-037aa6ebf920-kube-api-access-n8vpv\") pod \"multus-additional-cni-plugins-brchx\" (UID: \"482c17e3-998c-48aa-b158-037aa6ebf920\") " pod="openshift-multus/multus-additional-cni-plugins-brchx" Apr 16 16:47:45.815900 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.815531 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/40202585-b938-4a5c-bde8-ac1c5ea40044-host-cni-netd\") pod \"ovnkube-node-s2x9b\" (UID: \"40202585-b938-4a5c-bde8-ac1c5ea40044\") " pod="openshift-ovn-kubernetes/ovnkube-node-s2x9b" Apr 16 16:47:45.815900 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.815574 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/40202585-b938-4a5c-bde8-ac1c5ea40044-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-s2x9b\" (UID: \"40202585-b938-4a5c-bde8-ac1c5ea40044\") " pod="openshift-ovn-kubernetes/ovnkube-node-s2x9b" Apr 16 16:47:45.815900 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.815600 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/40202585-b938-4a5c-bde8-ac1c5ea40044-ovnkube-script-lib\") pod \"ovnkube-node-s2x9b\" (UID: \"40202585-b938-4a5c-bde8-ac1c5ea40044\") " pod="openshift-ovn-kubernetes/ovnkube-node-s2x9b" Apr 16 16:47:45.815900 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.815626 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khcvt\" (UniqueName: \"kubernetes.io/projected/c39d9cc0-8cac-46bd-968d-baba878cd954-kube-api-access-khcvt\") pod \"node-ca-j8nrt\" (UID: \"c39d9cc0-8cac-46bd-968d-baba878cd954\") " pod="openshift-image-registry/node-ca-j8nrt" Apr 16 16:47:45.815900 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.815649 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/482c17e3-998c-48aa-b158-037aa6ebf920-os-release\") pod \"multus-additional-cni-plugins-brchx\" (UID: \"482c17e3-998c-48aa-b158-037aa6ebf920\") " pod="openshift-multus/multus-additional-cni-plugins-brchx" Apr 16 16:47:45.815900 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.815689 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qd4vd\" (UniqueName: \"kubernetes.io/projected/65f280f9-caf6-429e-ac03-31bd647a05b6-kube-api-access-qd4vd\") pod \"network-metrics-daemon-5rwjz\" (UID: \"65f280f9-caf6-429e-ac03-31bd647a05b6\") " 
pod="openshift-multus/network-metrics-daemon-5rwjz" Apr 16 16:47:45.815900 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.815712 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/40202585-b938-4a5c-bde8-ac1c5ea40044-run-openvswitch\") pod \"ovnkube-node-s2x9b\" (UID: \"40202585-b938-4a5c-bde8-ac1c5ea40044\") " pod="openshift-ovn-kubernetes/ovnkube-node-s2x9b" Apr 16 16:47:45.815900 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.815735 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/482c17e3-998c-48aa-b158-037aa6ebf920-system-cni-dir\") pod \"multus-additional-cni-plugins-brchx\" (UID: \"482c17e3-998c-48aa-b158-037aa6ebf920\") " pod="openshift-multus/multus-additional-cni-plugins-brchx" Apr 16 16:47:45.815900 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.815783 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/40202585-b938-4a5c-bde8-ac1c5ea40044-host-kubelet\") pod \"ovnkube-node-s2x9b\" (UID: \"40202585-b938-4a5c-bde8-ac1c5ea40044\") " pod="openshift-ovn-kubernetes/ovnkube-node-s2x9b" Apr 16 16:47:45.815900 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.815818 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/40202585-b938-4a5c-bde8-ac1c5ea40044-systemd-units\") pod \"ovnkube-node-s2x9b\" (UID: \"40202585-b938-4a5c-bde8-ac1c5ea40044\") " pod="openshift-ovn-kubernetes/ovnkube-node-s2x9b" Apr 16 16:47:45.815900 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.815857 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/40202585-b938-4a5c-bde8-ac1c5ea40044-run-ovn\") pod \"ovnkube-node-s2x9b\" (UID: \"40202585-b938-4a5c-bde8-ac1c5ea40044\") " pod="openshift-ovn-kubernetes/ovnkube-node-s2x9b" Apr 16 16:47:45.815900 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.815877 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/40202585-b938-4a5c-bde8-ac1c5ea40044-log-socket\") pod \"ovnkube-node-s2x9b\" (UID: \"40202585-b938-4a5c-bde8-ac1c5ea40044\") " pod="openshift-ovn-kubernetes/ovnkube-node-s2x9b" Apr 16 16:47:45.815900 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.815905 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/40202585-b938-4a5c-bde8-ac1c5ea40044-host-cni-bin\") pod \"ovnkube-node-s2x9b\" (UID: \"40202585-b938-4a5c-bde8-ac1c5ea40044\") " pod="openshift-ovn-kubernetes/ovnkube-node-s2x9b" Apr 16 16:47:45.816583 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.815935 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/482c17e3-998c-48aa-b158-037aa6ebf920-cni-binary-copy\") pod \"multus-additional-cni-plugins-brchx\" (UID: \"482c17e3-998c-48aa-b158-037aa6ebf920\") " pod="openshift-multus/multus-additional-cni-plugins-brchx" Apr 16 16:47:45.816583 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.815965 2568 
Apr 16 16:47:45.816583 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.815988 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c39d9cc0-8cac-46bd-968d-baba878cd954-host\") pod \"node-ca-j8nrt\" (UID: \"c39d9cc0-8cac-46bd-968d-baba878cd954\") " pod="openshift-image-registry/node-ca-j8nrt"
Apr 16 16:47:45.816583 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.816050 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/40202585-b938-4a5c-bde8-ac1c5ea40044-host-slash\") pod \"ovnkube-node-s2x9b\" (UID: \"40202585-b938-4a5c-bde8-ac1c5ea40044\") " pod="openshift-ovn-kubernetes/ovnkube-node-s2x9b"
Apr 16 16:47:45.816583 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.816072 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/40202585-b938-4a5c-bde8-ac1c5ea40044-node-log\") pod \"ovnkube-node-s2x9b\" (UID: \"40202585-b938-4a5c-bde8-ac1c5ea40044\") " pod="openshift-ovn-kubernetes/ovnkube-node-s2x9b"
Apr 16 16:47:45.816583 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.816095 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nng79\" (UniqueName: \"kubernetes.io/projected/40202585-b938-4a5c-bde8-ac1c5ea40044-kube-api-access-nng79\") pod \"ovnkube-node-s2x9b\" (UID: \"40202585-b938-4a5c-bde8-ac1c5ea40044\") " pod="openshift-ovn-kubernetes/ovnkube-node-s2x9b"
Apr 16 16:47:45.816583 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.816109 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/6ee9ccfb-aa99-4334-93d4-21dad3568cda-sys-fs\") pod \"aws-ebs-csi-driver-node-xzx59\" (UID: \"6ee9ccfb-aa99-4334-93d4-21dad3568cda\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xzx59"
Apr 16 16:47:45.816583 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.816124 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2x5p\" (UniqueName: \"kubernetes.io/projected/018fd32e-3479-4227-9d81-8a232b27fc2b-kube-api-access-h2x5p\") pod \"network-check-target-ts7hl\" (UID: \"018fd32e-3479-4227-9d81-8a232b27fc2b\") " pod="openshift-network-diagnostics/network-check-target-ts7hl"
Apr 16 16:47:45.853130 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.853092 2568 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 16:42:44 +0000 UTC" deadline="2027-11-12 10:13:15.352975301 +0000 UTC"
Apr 16 16:47:45.853233 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.853131 2568 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13793h25m29.499848138s"
Apr 16 16:47:45.903429 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.903399 2568 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 16 16:47:45.917335 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.917295 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/40202585-b938-4a5c-bde8-ac1c5ea40044-systemd-units\") pod \"ovnkube-node-s2x9b\" (UID: \"40202585-b938-4a5c-bde8-ac1c5ea40044\") " pod="openshift-ovn-kubernetes/ovnkube-node-s2x9b"
Apr 16 16:47:45.917511 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.917349 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/40202585-b938-4a5c-bde8-ac1c5ea40044-run-ovn\") pod \"ovnkube-node-s2x9b\" (UID: \"40202585-b938-4a5c-bde8-ac1c5ea40044\") " pod="openshift-ovn-kubernetes/ovnkube-node-s2x9b"
Apr 16 16:47:45.917511 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.917379 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/40202585-b938-4a5c-bde8-ac1c5ea40044-host-cni-bin\") pod \"ovnkube-node-s2x9b\" (UID: \"40202585-b938-4a5c-bde8-ac1c5ea40044\") " pod="openshift-ovn-kubernetes/ovnkube-node-s2x9b"
Apr 16 16:47:45.917511 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.917406 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/482c17e3-998c-48aa-b158-037aa6ebf920-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-brchx\" (UID: \"482c17e3-998c-48aa-b158-037aa6ebf920\") " pod="openshift-multus/multus-additional-cni-plugins-brchx"
Apr 16 16:47:45.917511 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.917440 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/8d03bf36-9222-4d14-b16d-5f31b197f11a-multus-socket-dir-parent\") pod \"multus-nwmhb\" (UID: \"8d03bf36-9222-4d14-b16d-5f31b197f11a\") " pod="openshift-multus/multus-nwmhb"
Apr 16 16:47:45.917511 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.917407 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/40202585-b938-4a5c-bde8-ac1c5ea40044-run-ovn\") pod \"ovnkube-node-s2x9b\" (UID: \"40202585-b938-4a5c-bde8-ac1c5ea40044\") " pod="openshift-ovn-kubernetes/ovnkube-node-s2x9b"
Apr 16 16:47:45.917511 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.917407 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/40202585-b938-4a5c-bde8-ac1c5ea40044-systemd-units\") pod \"ovnkube-node-s2x9b\" (UID: \"40202585-b938-4a5c-bde8-ac1c5ea40044\") " pod="openshift-ovn-kubernetes/ovnkube-node-s2x9b"
Apr 16 16:47:45.917511 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.917450 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/40202585-b938-4a5c-bde8-ac1c5ea40044-host-cni-bin\") pod \"ovnkube-node-s2x9b\" (UID: \"40202585-b938-4a5c-bde8-ac1c5ea40044\") " pod="openshift-ovn-kubernetes/ovnkube-node-s2x9b"
Apr 16 16:47:45.917511 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.917484 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/49be5da9-85b8-4a24-8fec-db2c506efbbc-konnectivity-ca\") pod \"konnectivity-agent-d4xtv\" (UID: \"49be5da9-85b8-4a24-8fec-db2c506efbbc\") " pod="kube-system/konnectivity-agent-d4xtv"
\"kubernetes.io/configmap/49be5da9-85b8-4a24-8fec-db2c506efbbc-konnectivity-ca\") pod \"konnectivity-agent-d4xtv\" (UID: \"49be5da9-85b8-4a24-8fec-db2c506efbbc\") " pod="kube-system/konnectivity-agent-d4xtv" Apr 16 16:47:45.917871 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.917557 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/40202585-b938-4a5c-bde8-ac1c5ea40044-node-log\") pod \"ovnkube-node-s2x9b\" (UID: \"40202585-b938-4a5c-bde8-ac1c5ea40044\") " pod="openshift-ovn-kubernetes/ovnkube-node-s2x9b" Apr 16 16:47:45.917871 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.917592 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/6ee9ccfb-aa99-4334-93d4-21dad3568cda-sys-fs\") pod \"aws-ebs-csi-driver-node-xzx59\" (UID: \"6ee9ccfb-aa99-4334-93d4-21dad3568cda\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xzx59" Apr 16 16:47:45.917871 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.917599 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/40202585-b938-4a5c-bde8-ac1c5ea40044-node-log\") pod \"ovnkube-node-s2x9b\" (UID: \"40202585-b938-4a5c-bde8-ac1c5ea40044\") " pod="openshift-ovn-kubernetes/ovnkube-node-s2x9b" Apr 16 16:47:45.917871 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.917623 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/49be5da9-85b8-4a24-8fec-db2c506efbbc-agent-certs\") pod \"konnectivity-agent-d4xtv\" (UID: \"49be5da9-85b8-4a24-8fec-db2c506efbbc\") " pod="kube-system/konnectivity-agent-d4xtv" Apr 16 16:47:45.917871 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.917645 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/9227133b-0000-4b75-818e-9c2ed2ffe214-etc-sysctl-conf\") pod \"tuned-7p8g2\" (UID: \"9227133b-0000-4b75-818e-9c2ed2ffe214\") " pod="openshift-cluster-node-tuning-operator/tuned-7p8g2" Apr 16 16:47:45.917871 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.917660 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/6ee9ccfb-aa99-4334-93d4-21dad3568cda-sys-fs\") pod \"aws-ebs-csi-driver-node-xzx59\" (UID: \"6ee9ccfb-aa99-4334-93d4-21dad3568cda\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xzx59" Apr 16 16:47:45.917871 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.917690 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5mcc4\" (UniqueName: \"kubernetes.io/projected/9227133b-0000-4b75-818e-9c2ed2ffe214-kube-api-access-5mcc4\") pod \"tuned-7p8g2\" (UID: \"9227133b-0000-4b75-818e-9c2ed2ffe214\") " pod="openshift-cluster-node-tuning-operator/tuned-7p8g2" Apr 16 16:47:45.917871 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.917721 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/40202585-b938-4a5c-bde8-ac1c5ea40044-ovnkube-config\") pod \"ovnkube-node-s2x9b\" (UID: \"40202585-b938-4a5c-bde8-ac1c5ea40044\") " pod="openshift-ovn-kubernetes/ovnkube-node-s2x9b" Apr 16 16:47:45.917871 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.917747 2568 
Apr 16 16:47:45.917871 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.917770 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/0c447c29-c470-4314-becc-ad24580321c8-hosts-file\") pod \"node-resolver-ptmqw\" (UID: \"0c447c29-c470-4314-becc-ad24580321c8\") " pod="openshift-dns/node-resolver-ptmqw"
Apr 16 16:47:45.917871 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.917823 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/40202585-b938-4a5c-bde8-ac1c5ea40044-host-run-netns\") pod \"ovnkube-node-s2x9b\" (UID: \"40202585-b938-4a5c-bde8-ac1c5ea40044\") " pod="openshift-ovn-kubernetes/ovnkube-node-s2x9b"
Apr 16 16:47:45.917871 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.917864 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/40202585-b938-4a5c-bde8-ac1c5ea40044-host-run-ovn-kubernetes\") pod \"ovnkube-node-s2x9b\" (UID: \"40202585-b938-4a5c-bde8-ac1c5ea40044\") " pod="openshift-ovn-kubernetes/ovnkube-node-s2x9b"
Apr 16 16:47:45.918318 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.917894 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/482c17e3-998c-48aa-b158-037aa6ebf920-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-brchx\" (UID: \"482c17e3-998c-48aa-b158-037aa6ebf920\") " pod="openshift-multus/multus-additional-cni-plugins-brchx"
Apr 16 16:47:45.918318 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.917907 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/0c447c29-c470-4314-becc-ad24580321c8-hosts-file\") pod \"node-resolver-ptmqw\" (UID: \"0c447c29-c470-4314-becc-ad24580321c8\") " pod="openshift-dns/node-resolver-ptmqw"
Apr 16 16:47:45.918318 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.917922 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n8vpv\" (UniqueName: \"kubernetes.io/projected/482c17e3-998c-48aa-b158-037aa6ebf920-kube-api-access-n8vpv\") pod \"multus-additional-cni-plugins-brchx\" (UID: \"482c17e3-998c-48aa-b158-037aa6ebf920\") " pod="openshift-multus/multus-additional-cni-plugins-brchx"
Apr 16 16:47:45.918318 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.917956 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/40202585-b938-4a5c-bde8-ac1c5ea40044-host-run-ovn-kubernetes\") pod \"ovnkube-node-s2x9b\" (UID: \"40202585-b938-4a5c-bde8-ac1c5ea40044\") " pod="openshift-ovn-kubernetes/ovnkube-node-s2x9b"
Apr 16 16:47:45.918318 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.917969 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/9227133b-0000-4b75-818e-9c2ed2ffe214-etc-sysconfig\") pod \"tuned-7p8g2\" (UID: \"9227133b-0000-4b75-818e-9c2ed2ffe214\") " pod="openshift-cluster-node-tuning-operator/tuned-7p8g2"
pod="openshift-cluster-node-tuning-operator/tuned-7p8g2" Apr 16 16:47:45.918318 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.917995 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/9227133b-0000-4b75-818e-9c2ed2ffe214-run\") pod \"tuned-7p8g2\" (UID: \"9227133b-0000-4b75-818e-9c2ed2ffe214\") " pod="openshift-cluster-node-tuning-operator/tuned-7p8g2" Apr 16 16:47:45.918318 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.918023 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8gz4z\" (UniqueName: \"kubernetes.io/projected/0c447c29-c470-4314-becc-ad24580321c8-kube-api-access-8gz4z\") pod \"node-resolver-ptmqw\" (UID: \"0c447c29-c470-4314-becc-ad24580321c8\") " pod="openshift-dns/node-resolver-ptmqw" Apr 16 16:47:45.918318 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.918048 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/40202585-b938-4a5c-bde8-ac1c5ea40044-etc-openvswitch\") pod \"ovnkube-node-s2x9b\" (UID: \"40202585-b938-4a5c-bde8-ac1c5ea40044\") " pod="openshift-ovn-kubernetes/ovnkube-node-s2x9b" Apr 16 16:47:45.918318 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.918077 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rmzvh\" (UniqueName: \"kubernetes.io/projected/6ee9ccfb-aa99-4334-93d4-21dad3568cda-kube-api-access-rmzvh\") pod \"aws-ebs-csi-driver-node-xzx59\" (UID: \"6ee9ccfb-aa99-4334-93d4-21dad3568cda\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xzx59" Apr 16 16:47:45.918318 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.918115 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/482c17e3-998c-48aa-b158-037aa6ebf920-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-brchx\" (UID: \"482c17e3-998c-48aa-b158-037aa6ebf920\") " pod="openshift-multus/multus-additional-cni-plugins-brchx" Apr 16 16:47:45.918318 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.918105 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8d03bf36-9222-4d14-b16d-5f31b197f11a-host-var-lib-cni-bin\") pod \"multus-nwmhb\" (UID: \"8d03bf36-9222-4d14-b16d-5f31b197f11a\") " pod="openshift-multus/multus-nwmhb" Apr 16 16:47:45.918318 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.918277 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/9227133b-0000-4b75-818e-9c2ed2ffe214-etc-tuned\") pod \"tuned-7p8g2\" (UID: \"9227133b-0000-4b75-818e-9c2ed2ffe214\") " pod="openshift-cluster-node-tuning-operator/tuned-7p8g2" Apr 16 16:47:45.918860 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.918318 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6ee9ccfb-aa99-4334-93d4-21dad3568cda-kubelet-dir\") pod \"aws-ebs-csi-driver-node-xzx59\" (UID: \"6ee9ccfb-aa99-4334-93d4-21dad3568cda\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xzx59" Apr 16 16:47:45.918860 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.918333 2568 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/40202585-b938-4a5c-bde8-ac1c5ea40044-ovnkube-config\") pod \"ovnkube-node-s2x9b\" (UID: \"40202585-b938-4a5c-bde8-ac1c5ea40044\") " pod="openshift-ovn-kubernetes/ovnkube-node-s2x9b" Apr 16 16:47:45.918860 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.918362 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/40202585-b938-4a5c-bde8-ac1c5ea40044-etc-openvswitch\") pod \"ovnkube-node-s2x9b\" (UID: \"40202585-b938-4a5c-bde8-ac1c5ea40044\") " pod="openshift-ovn-kubernetes/ovnkube-node-s2x9b" Apr 16 16:47:45.918860 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.918364 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/482c17e3-998c-48aa-b158-037aa6ebf920-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-brchx\" (UID: \"482c17e3-998c-48aa-b158-037aa6ebf920\") " pod="openshift-multus/multus-additional-cni-plugins-brchx" Apr 16 16:47:45.918860 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.918380 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6ee9ccfb-aa99-4334-93d4-21dad3568cda-kubelet-dir\") pod \"aws-ebs-csi-driver-node-xzx59\" (UID: \"6ee9ccfb-aa99-4334-93d4-21dad3568cda\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xzx59" Apr 16 16:47:45.918860 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.918400 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/482c17e3-998c-48aa-b158-037aa6ebf920-cnibin\") pod \"multus-additional-cni-plugins-brchx\" (UID: \"482c17e3-998c-48aa-b158-037aa6ebf920\") " pod="openshift-multus/multus-additional-cni-plugins-brchx" Apr 16 16:47:45.918860 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.918426 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/8d03bf36-9222-4d14-b16d-5f31b197f11a-host-var-lib-cni-multus\") pod \"multus-nwmhb\" (UID: \"8d03bf36-9222-4d14-b16d-5f31b197f11a\") " pod="openshift-multus/multus-nwmhb" Apr 16 16:47:45.918860 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.918450 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9227133b-0000-4b75-818e-9c2ed2ffe214-etc-kubernetes\") pod \"tuned-7p8g2\" (UID: \"9227133b-0000-4b75-818e-9c2ed2ffe214\") " pod="openshift-cluster-node-tuning-operator/tuned-7p8g2" Apr 16 16:47:45.918860 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.918480 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/482c17e3-998c-48aa-b158-037aa6ebf920-cnibin\") pod \"multus-additional-cni-plugins-brchx\" (UID: \"482c17e3-998c-48aa-b158-037aa6ebf920\") " pod="openshift-multus/multus-additional-cni-plugins-brchx" Apr 16 16:47:45.918860 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.918486 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/c39d9cc0-8cac-46bd-968d-baba878cd954-serviceca\") pod \"node-ca-j8nrt\" (UID: \"c39d9cc0-8cac-46bd-968d-baba878cd954\") " pod="openshift-image-registry/node-ca-j8nrt" Apr 16 16:47:45.918860 
ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.918561 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/40202585-b938-4a5c-bde8-ac1c5ea40044-run-systemd\") pod \"ovnkube-node-s2x9b\" (UID: \"40202585-b938-4a5c-bde8-ac1c5ea40044\") " pod="openshift-ovn-kubernetes/ovnkube-node-s2x9b" Apr 16 16:47:45.918860 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.918619 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8d03bf36-9222-4d14-b16d-5f31b197f11a-cnibin\") pod \"multus-nwmhb\" (UID: \"8d03bf36-9222-4d14-b16d-5f31b197f11a\") " pod="openshift-multus/multus-nwmhb" Apr 16 16:47:45.918860 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.918637 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/40202585-b938-4a5c-bde8-ac1c5ea40044-run-systemd\") pod \"ovnkube-node-s2x9b\" (UID: \"40202585-b938-4a5c-bde8-ac1c5ea40044\") " pod="openshift-ovn-kubernetes/ovnkube-node-s2x9b" Apr 16 16:47:45.918860 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.918668 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8d03bf36-9222-4d14-b16d-5f31b197f11a-os-release\") pod \"multus-nwmhb\" (UID: \"8d03bf36-9222-4d14-b16d-5f31b197f11a\") " pod="openshift-multus/multus-nwmhb" Apr 16 16:47:45.918860 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.918710 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8d03bf36-9222-4d14-b16d-5f31b197f11a-cni-binary-copy\") pod \"multus-nwmhb\" (UID: \"8d03bf36-9222-4d14-b16d-5f31b197f11a\") " pod="openshift-multus/multus-nwmhb" Apr 16 16:47:45.918860 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.918742 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/40202585-b938-4a5c-bde8-ac1c5ea40044-ovnkube-script-lib\") pod \"ovnkube-node-s2x9b\" (UID: \"40202585-b938-4a5c-bde8-ac1c5ea40044\") " pod="openshift-ovn-kubernetes/ovnkube-node-s2x9b" Apr 16 16:47:45.918860 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.918785 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qd4vd\" (UniqueName: \"kubernetes.io/projected/65f280f9-caf6-429e-ac03-31bd647a05b6-kube-api-access-qd4vd\") pod \"network-metrics-daemon-5rwjz\" (UID: \"65f280f9-caf6-429e-ac03-31bd647a05b6\") " pod="openshift-multus/network-metrics-daemon-5rwjz" Apr 16 16:47:45.919623 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.918821 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/c39d9cc0-8cac-46bd-968d-baba878cd954-serviceca\") pod \"node-ca-j8nrt\" (UID: \"c39d9cc0-8cac-46bd-968d-baba878cd954\") " pod="openshift-image-registry/node-ca-j8nrt" Apr 16 16:47:45.919623 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.918894 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/8d03bf36-9222-4d14-b16d-5f31b197f11a-host-run-multus-certs\") pod \"multus-nwmhb\" (UID: \"8d03bf36-9222-4d14-b16d-5f31b197f11a\") " 
pod="openshift-multus/multus-nwmhb" Apr 16 16:47:45.919623 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.918948 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8d03bf36-9222-4d14-b16d-5f31b197f11a-etc-kubernetes\") pod \"multus-nwmhb\" (UID: \"8d03bf36-9222-4d14-b16d-5f31b197f11a\") " pod="openshift-multus/multus-nwmhb" Apr 16 16:47:45.919623 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.918975 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/9227133b-0000-4b75-818e-9c2ed2ffe214-etc-modprobe-d\") pod \"tuned-7p8g2\" (UID: \"9227133b-0000-4b75-818e-9c2ed2ffe214\") " pod="openshift-cluster-node-tuning-operator/tuned-7p8g2" Apr 16 16:47:45.919623 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.919008 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8d03bf36-9222-4d14-b16d-5f31b197f11a-multus-cni-dir\") pod \"multus-nwmhb\" (UID: \"8d03bf36-9222-4d14-b16d-5f31b197f11a\") " pod="openshift-multus/multus-nwmhb" Apr 16 16:47:45.919623 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.919032 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndr87\" (UniqueName: \"kubernetes.io/projected/8d03bf36-9222-4d14-b16d-5f31b197f11a-kube-api-access-ndr87\") pod \"multus-nwmhb\" (UID: \"8d03bf36-9222-4d14-b16d-5f31b197f11a\") " pod="openshift-multus/multus-nwmhb" Apr 16 16:47:45.919623 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.919056 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9227133b-0000-4b75-818e-9c2ed2ffe214-var-lib-kubelet\") pod \"tuned-7p8g2\" (UID: \"9227133b-0000-4b75-818e-9c2ed2ffe214\") " pod="openshift-cluster-node-tuning-operator/tuned-7p8g2" Apr 16 16:47:45.919623 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.919085 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/40202585-b938-4a5c-bde8-ac1c5ea40044-host-kubelet\") pod \"ovnkube-node-s2x9b\" (UID: \"40202585-b938-4a5c-bde8-ac1c5ea40044\") " pod="openshift-ovn-kubernetes/ovnkube-node-s2x9b" Apr 16 16:47:45.919623 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.919112 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/40202585-b938-4a5c-bde8-ac1c5ea40044-log-socket\") pod \"ovnkube-node-s2x9b\" (UID: \"40202585-b938-4a5c-bde8-ac1c5ea40044\") " pod="openshift-ovn-kubernetes/ovnkube-node-s2x9b" Apr 16 16:47:45.919623 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.919137 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/482c17e3-998c-48aa-b158-037aa6ebf920-cni-binary-copy\") pod \"multus-additional-cni-plugins-brchx\" (UID: \"482c17e3-998c-48aa-b158-037aa6ebf920\") " pod="openshift-multus/multus-additional-cni-plugins-brchx" Apr 16 16:47:45.919623 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.919142 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/40202585-b938-4a5c-bde8-ac1c5ea40044-host-kubelet\") pod \"ovnkube-node-s2x9b\" (UID: \"40202585-b938-4a5c-bde8-ac1c5ea40044\") " pod="openshift-ovn-kubernetes/ovnkube-node-s2x9b" Apr 16 16:47:45.919623 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.919162 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h2x5p\" (UniqueName: \"kubernetes.io/projected/018fd32e-3479-4227-9d81-8a232b27fc2b-kube-api-access-h2x5p\") pod \"network-check-target-ts7hl\" (UID: \"018fd32e-3479-4227-9d81-8a232b27fc2b\") " pod="openshift-network-diagnostics/network-check-target-ts7hl" Apr 16 16:47:45.919623 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.919191 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8d03bf36-9222-4d14-b16d-5f31b197f11a-host-run-netns\") pod \"multus-nwmhb\" (UID: \"8d03bf36-9222-4d14-b16d-5f31b197f11a\") " pod="openshift-multus/multus-nwmhb" Apr 16 16:47:45.919623 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.919194 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/40202585-b938-4a5c-bde8-ac1c5ea40044-log-socket\") pod \"ovnkube-node-s2x9b\" (UID: \"40202585-b938-4a5c-bde8-ac1c5ea40044\") " pod="openshift-ovn-kubernetes/ovnkube-node-s2x9b" Apr 16 16:47:45.919623 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.919235 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/40202585-b938-4a5c-bde8-ac1c5ea40044-ovnkube-script-lib\") pod \"ovnkube-node-s2x9b\" (UID: \"40202585-b938-4a5c-bde8-ac1c5ea40044\") " pod="openshift-ovn-kubernetes/ovnkube-node-s2x9b" Apr 16 16:47:45.919623 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.919234 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1f2cee13-cf01-40a3-993d-f9a41ddeae81-host-slash\") pod \"iptables-alerter-j6gfs\" (UID: \"1f2cee13-cf01-40a3-993d-f9a41ddeae81\") " pod="openshift-network-operator/iptables-alerter-j6gfs" Apr 16 16:47:45.919623 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.919277 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c39d9cc0-8cac-46bd-968d-baba878cd954-host\") pod \"node-ca-j8nrt\" (UID: \"c39d9cc0-8cac-46bd-968d-baba878cd954\") " pod="openshift-image-registry/node-ca-j8nrt" Apr 16 16:47:45.920333 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.919301 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/40202585-b938-4a5c-bde8-ac1c5ea40044-host-slash\") pod \"ovnkube-node-s2x9b\" (UID: \"40202585-b938-4a5c-bde8-ac1c5ea40044\") " pod="openshift-ovn-kubernetes/ovnkube-node-s2x9b" Apr 16 16:47:45.920333 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.919326 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nng79\" (UniqueName: \"kubernetes.io/projected/40202585-b938-4a5c-bde8-ac1c5ea40044-kube-api-access-nng79\") pod \"ovnkube-node-s2x9b\" (UID: \"40202585-b938-4a5c-bde8-ac1c5ea40044\") " pod="openshift-ovn-kubernetes/ovnkube-node-s2x9b" Apr 16 16:47:45.920333 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.919350 2568 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/65f280f9-caf6-429e-ac03-31bd647a05b6-metrics-certs\") pod \"network-metrics-daemon-5rwjz\" (UID: \"65f280f9-caf6-429e-ac03-31bd647a05b6\") " pod="openshift-multus/network-metrics-daemon-5rwjz" Apr 16 16:47:45.920333 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.919355 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c39d9cc0-8cac-46bd-968d-baba878cd954-host\") pod \"node-ca-j8nrt\" (UID: \"c39d9cc0-8cac-46bd-968d-baba878cd954\") " pod="openshift-image-registry/node-ca-j8nrt" Apr 16 16:47:45.920333 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.919376 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/8d03bf36-9222-4d14-b16d-5f31b197f11a-host-run-k8s-cni-cncf-io\") pod \"multus-nwmhb\" (UID: \"8d03bf36-9222-4d14-b16d-5f31b197f11a\") " pod="openshift-multus/multus-nwmhb" Apr 16 16:47:45.920333 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.919409 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/9227133b-0000-4b75-818e-9c2ed2ffe214-etc-sysctl-d\") pod \"tuned-7p8g2\" (UID: \"9227133b-0000-4b75-818e-9c2ed2ffe214\") " pod="openshift-cluster-node-tuning-operator/tuned-7p8g2" Apr 16 16:47:45.920333 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.919431 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/40202585-b938-4a5c-bde8-ac1c5ea40044-host-slash\") pod \"ovnkube-node-s2x9b\" (UID: \"40202585-b938-4a5c-bde8-ac1c5ea40044\") " pod="openshift-ovn-kubernetes/ovnkube-node-s2x9b" Apr 16 16:47:45.920333 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.919435 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/40202585-b938-4a5c-bde8-ac1c5ea40044-env-overrides\") pod \"ovnkube-node-s2x9b\" (UID: \"40202585-b938-4a5c-bde8-ac1c5ea40044\") " pod="openshift-ovn-kubernetes/ovnkube-node-s2x9b" Apr 16 16:47:45.920333 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.919482 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/0c447c29-c470-4314-becc-ad24580321c8-tmp-dir\") pod \"node-resolver-ptmqw\" (UID: \"0c447c29-c470-4314-becc-ad24580321c8\") " pod="openshift-dns/node-resolver-ptmqw" Apr 16 16:47:45.920333 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.919509 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/40202585-b938-4a5c-bde8-ac1c5ea40044-var-lib-openvswitch\") pod \"ovnkube-node-s2x9b\" (UID: \"40202585-b938-4a5c-bde8-ac1c5ea40044\") " pod="openshift-ovn-kubernetes/ovnkube-node-s2x9b" Apr 16 16:47:45.920333 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.919554 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/6ee9ccfb-aa99-4334-93d4-21dad3568cda-socket-dir\") pod \"aws-ebs-csi-driver-node-xzx59\" (UID: \"6ee9ccfb-aa99-4334-93d4-21dad3568cda\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xzx59" Apr 16 16:47:45.920333 ip-10-0-131-63 
kubenswrapper[2568]: E0416 16:47:45.919560 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:47:45.920333 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.919578 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/482c17e3-998c-48aa-b158-037aa6ebf920-cni-binary-copy\") pod \"multus-additional-cni-plugins-brchx\" (UID: \"482c17e3-998c-48aa-b158-037aa6ebf920\") " pod="openshift-multus/multus-additional-cni-plugins-brchx" Apr 16 16:47:45.920333 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.919581 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/6ee9ccfb-aa99-4334-93d4-21dad3568cda-registration-dir\") pod \"aws-ebs-csi-driver-node-xzx59\" (UID: \"6ee9ccfb-aa99-4334-93d4-21dad3568cda\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xzx59" Apr 16 16:47:45.920333 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.919636 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/40202585-b938-4a5c-bde8-ac1c5ea40044-var-lib-openvswitch\") pod \"ovnkube-node-s2x9b\" (UID: \"40202585-b938-4a5c-bde8-ac1c5ea40044\") " pod="openshift-ovn-kubernetes/ovnkube-node-s2x9b" Apr 16 16:47:45.920333 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:47:45.919647 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/65f280f9-caf6-429e-ac03-31bd647a05b6-metrics-certs podName:65f280f9-caf6-429e-ac03-31bd647a05b6 nodeName:}" failed. No retries permitted until 2026-04-16 16:47:46.41960964 +0000 UTC m=+3.089786401 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/65f280f9-caf6-429e-ac03-31bd647a05b6-metrics-certs") pod "network-metrics-daemon-5rwjz" (UID: "65f280f9-caf6-429e-ac03-31bd647a05b6") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:47:45.920333 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.919651 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/6ee9ccfb-aa99-4334-93d4-21dad3568cda-registration-dir\") pod \"aws-ebs-csi-driver-node-xzx59\" (UID: \"6ee9ccfb-aa99-4334-93d4-21dad3568cda\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xzx59" Apr 16 16:47:45.921014 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.919680 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/6ee9ccfb-aa99-4334-93d4-21dad3568cda-device-dir\") pod \"aws-ebs-csi-driver-node-xzx59\" (UID: \"6ee9ccfb-aa99-4334-93d4-21dad3568cda\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xzx59" Apr 16 16:47:45.921014 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.919694 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/6ee9ccfb-aa99-4334-93d4-21dad3568cda-socket-dir\") pod \"aws-ebs-csi-driver-node-xzx59\" (UID: \"6ee9ccfb-aa99-4334-93d4-21dad3568cda\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xzx59" Apr 16 16:47:45.921014 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.919707 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/482c17e3-998c-48aa-b158-037aa6ebf920-tuning-conf-dir\") pod \"multus-additional-cni-plugins-brchx\" (UID: \"482c17e3-998c-48aa-b158-037aa6ebf920\") " pod="openshift-multus/multus-additional-cni-plugins-brchx" Apr 16 16:47:45.921014 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.919741 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8d03bf36-9222-4d14-b16d-5f31b197f11a-system-cni-dir\") pod \"multus-nwmhb\" (UID: \"8d03bf36-9222-4d14-b16d-5f31b197f11a\") " pod="openshift-multus/multus-nwmhb" Apr 16 16:47:45.921014 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.919763 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8d03bf36-9222-4d14-b16d-5f31b197f11a-host-var-lib-kubelet\") pod \"multus-nwmhb\" (UID: \"8d03bf36-9222-4d14-b16d-5f31b197f11a\") " pod="openshift-multus/multus-nwmhb" Apr 16 16:47:45.921014 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.919780 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/8d03bf36-9222-4d14-b16d-5f31b197f11a-hostroot\") pod \"multus-nwmhb\" (UID: \"8d03bf36-9222-4d14-b16d-5f31b197f11a\") " pod="openshift-multus/multus-nwmhb" Apr 16 16:47:45.921014 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.919821 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/8d03bf36-9222-4d14-b16d-5f31b197f11a-multus-daemon-config\") pod \"multus-nwmhb\" (UID: 
\"8d03bf36-9222-4d14-b16d-5f31b197f11a\") " pod="openshift-multus/multus-nwmhb" Apr 16 16:47:45.921014 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.919856 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9227133b-0000-4b75-818e-9c2ed2ffe214-tmp\") pod \"tuned-7p8g2\" (UID: \"9227133b-0000-4b75-818e-9c2ed2ffe214\") " pod="openshift-cluster-node-tuning-operator/tuned-7p8g2" Apr 16 16:47:45.921014 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.919885 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/40202585-b938-4a5c-bde8-ac1c5ea40044-ovn-node-metrics-cert\") pod \"ovnkube-node-s2x9b\" (UID: \"40202585-b938-4a5c-bde8-ac1c5ea40044\") " pod="openshift-ovn-kubernetes/ovnkube-node-s2x9b" Apr 16 16:47:45.921014 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.919905 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/9227133b-0000-4b75-818e-9c2ed2ffe214-etc-systemd\") pod \"tuned-7p8g2\" (UID: \"9227133b-0000-4b75-818e-9c2ed2ffe214\") " pod="openshift-cluster-node-tuning-operator/tuned-7p8g2" Apr 16 16:47:45.921014 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.919907 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/6ee9ccfb-aa99-4334-93d4-21dad3568cda-device-dir\") pod \"aws-ebs-csi-driver-node-xzx59\" (UID: \"6ee9ccfb-aa99-4334-93d4-21dad3568cda\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xzx59" Apr 16 16:47:45.921014 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.919923 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9227133b-0000-4b75-818e-9c2ed2ffe214-host\") pod \"tuned-7p8g2\" (UID: \"9227133b-0000-4b75-818e-9c2ed2ffe214\") " pod="openshift-cluster-node-tuning-operator/tuned-7p8g2" Apr 16 16:47:45.921014 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.919935 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/0c447c29-c470-4314-becc-ad24580321c8-tmp-dir\") pod \"node-resolver-ptmqw\" (UID: \"0c447c29-c470-4314-becc-ad24580321c8\") " pod="openshift-dns/node-resolver-ptmqw" Apr 16 16:47:45.921014 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.919941 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/6ee9ccfb-aa99-4334-93d4-21dad3568cda-etc-selinux\") pod \"aws-ebs-csi-driver-node-xzx59\" (UID: \"6ee9ccfb-aa99-4334-93d4-21dad3568cda\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xzx59" Apr 16 16:47:45.921014 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.920003 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8d03bf36-9222-4d14-b16d-5f31b197f11a-multus-conf-dir\") pod \"multus-nwmhb\" (UID: \"8d03bf36-9222-4d14-b16d-5f31b197f11a\") " pod="openshift-multus/multus-nwmhb" Apr 16 16:47:45.921014 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.920040 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/40202585-b938-4a5c-bde8-ac1c5ea40044-env-overrides\") pod \"ovnkube-node-s2x9b\" (UID: \"40202585-b938-4a5c-bde8-ac1c5ea40044\") " pod="openshift-ovn-kubernetes/ovnkube-node-s2x9b" Apr 16 16:47:45.921014 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.920054 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/6ee9ccfb-aa99-4334-93d4-21dad3568cda-etc-selinux\") pod \"aws-ebs-csi-driver-node-xzx59\" (UID: \"6ee9ccfb-aa99-4334-93d4-21dad3568cda\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xzx59" Apr 16 16:47:45.921557 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.920055 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9227133b-0000-4b75-818e-9c2ed2ffe214-sys\") pod \"tuned-7p8g2\" (UID: \"9227133b-0000-4b75-818e-9c2ed2ffe214\") " pod="openshift-cluster-node-tuning-operator/tuned-7p8g2" Apr 16 16:47:45.921557 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.920105 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/40202585-b938-4a5c-bde8-ac1c5ea40044-host-cni-netd\") pod \"ovnkube-node-s2x9b\" (UID: \"40202585-b938-4a5c-bde8-ac1c5ea40044\") " pod="openshift-ovn-kubernetes/ovnkube-node-s2x9b" Apr 16 16:47:45.921557 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.920145 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/40202585-b938-4a5c-bde8-ac1c5ea40044-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-s2x9b\" (UID: \"40202585-b938-4a5c-bde8-ac1c5ea40044\") " pod="openshift-ovn-kubernetes/ovnkube-node-s2x9b" Apr 16 16:47:45.921557 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.920174 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/40202585-b938-4a5c-bde8-ac1c5ea40044-host-cni-netd\") pod \"ovnkube-node-s2x9b\" (UID: \"40202585-b938-4a5c-bde8-ac1c5ea40044\") " pod="openshift-ovn-kubernetes/ovnkube-node-s2x9b" Apr 16 16:47:45.921557 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.920222 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-khcvt\" (UniqueName: \"kubernetes.io/projected/c39d9cc0-8cac-46bd-968d-baba878cd954-kube-api-access-khcvt\") pod \"node-ca-j8nrt\" (UID: \"c39d9cc0-8cac-46bd-968d-baba878cd954\") " pod="openshift-image-registry/node-ca-j8nrt" Apr 16 16:47:45.921557 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.920251 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/482c17e3-998c-48aa-b158-037aa6ebf920-os-release\") pod \"multus-additional-cni-plugins-brchx\" (UID: \"482c17e3-998c-48aa-b158-037aa6ebf920\") " pod="openshift-multus/multus-additional-cni-plugins-brchx" Apr 16 16:47:45.921557 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.920261 2568 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 16 16:47:45.921557 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.920273 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/40202585-b938-4a5c-bde8-ac1c5ea40044-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-s2x9b\" (UID: \"40202585-b938-4a5c-bde8-ac1c5ea40044\") " pod="openshift-ovn-kubernetes/ovnkube-node-s2x9b" Apr 16 16:47:45.921557 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.920316 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9227133b-0000-4b75-818e-9c2ed2ffe214-lib-modules\") pod \"tuned-7p8g2\" (UID: \"9227133b-0000-4b75-818e-9c2ed2ffe214\") " pod="openshift-cluster-node-tuning-operator/tuned-7p8g2" Apr 16 16:47:45.921557 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.920346 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/1f2cee13-cf01-40a3-993d-f9a41ddeae81-iptables-alerter-script\") pod \"iptables-alerter-j6gfs\" (UID: \"1f2cee13-cf01-40a3-993d-f9a41ddeae81\") " pod="openshift-network-operator/iptables-alerter-j6gfs" Apr 16 16:47:45.921557 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.920376 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/482c17e3-998c-48aa-b158-037aa6ebf920-os-release\") pod \"multus-additional-cni-plugins-brchx\" (UID: \"482c17e3-998c-48aa-b158-037aa6ebf920\") " pod="openshift-multus/multus-additional-cni-plugins-brchx" Apr 16 16:47:45.921557 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.920420 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vghl6\" (UniqueName: \"kubernetes.io/projected/1f2cee13-cf01-40a3-993d-f9a41ddeae81-kube-api-access-vghl6\") pod \"iptables-alerter-j6gfs\" (UID: \"1f2cee13-cf01-40a3-993d-f9a41ddeae81\") " pod="openshift-network-operator/iptables-alerter-j6gfs" Apr 16 16:47:45.921557 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.920455 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/40202585-b938-4a5c-bde8-ac1c5ea40044-run-openvswitch\") pod \"ovnkube-node-s2x9b\" (UID: \"40202585-b938-4a5c-bde8-ac1c5ea40044\") " pod="openshift-ovn-kubernetes/ovnkube-node-s2x9b" Apr 16 16:47:45.921557 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.920482 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/482c17e3-998c-48aa-b158-037aa6ebf920-system-cni-dir\") pod \"multus-additional-cni-plugins-brchx\" (UID: \"482c17e3-998c-48aa-b158-037aa6ebf920\") " pod="openshift-multus/multus-additional-cni-plugins-brchx" Apr 16 16:47:45.921557 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.920529 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/40202585-b938-4a5c-bde8-ac1c5ea40044-run-openvswitch\") pod \"ovnkube-node-s2x9b\" (UID: \"40202585-b938-4a5c-bde8-ac1c5ea40044\") " pod="openshift-ovn-kubernetes/ovnkube-node-s2x9b" Apr 16 16:47:45.921557 
ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.920587 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/482c17e3-998c-48aa-b158-037aa6ebf920-system-cni-dir\") pod \"multus-additional-cni-plugins-brchx\" (UID: \"482c17e3-998c-48aa-b158-037aa6ebf920\") " pod="openshift-multus/multus-additional-cni-plugins-brchx" Apr 16 16:47:45.921557 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.920681 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/482c17e3-998c-48aa-b158-037aa6ebf920-tuning-conf-dir\") pod \"multus-additional-cni-plugins-brchx\" (UID: \"482c17e3-998c-48aa-b158-037aa6ebf920\") " pod="openshift-multus/multus-additional-cni-plugins-brchx" Apr 16 16:47:45.923337 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.923318 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/40202585-b938-4a5c-bde8-ac1c5ea40044-ovn-node-metrics-cert\") pod \"ovnkube-node-s2x9b\" (UID: \"40202585-b938-4a5c-bde8-ac1c5ea40044\") " pod="openshift-ovn-kubernetes/ovnkube-node-s2x9b" Apr 16 16:47:45.928622 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:47:45.928591 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 16:47:45.928622 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:47:45.928615 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 16:47:45.928822 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:47:45.928646 2568 projected.go:194] Error preparing data for projected volume kube-api-access-h2x5p for pod openshift-network-diagnostics/network-check-target-ts7hl: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:47:45.928822 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:47:45.928712 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/018fd32e-3479-4227-9d81-8a232b27fc2b-kube-api-access-h2x5p podName:018fd32e-3479-4227-9d81-8a232b27fc2b nodeName:}" failed. No retries permitted until 2026-04-16 16:47:46.428694233 +0000 UTC m=+3.098870993 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-h2x5p" (UniqueName: "kubernetes.io/projected/018fd32e-3479-4227-9d81-8a232b27fc2b-kube-api-access-h2x5p") pod "network-check-target-ts7hl" (UID: "018fd32e-3479-4227-9d81-8a232b27fc2b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:47:45.931015 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.930030 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8gz4z\" (UniqueName: \"kubernetes.io/projected/0c447c29-c470-4314-becc-ad24580321c8-kube-api-access-8gz4z\") pod \"node-resolver-ptmqw\" (UID: \"0c447c29-c470-4314-becc-ad24580321c8\") " pod="openshift-dns/node-resolver-ptmqw" Apr 16 16:47:45.931296 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.931272 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-khcvt\" (UniqueName: \"kubernetes.io/projected/c39d9cc0-8cac-46bd-968d-baba878cd954-kube-api-access-khcvt\") pod \"node-ca-j8nrt\" (UID: \"c39d9cc0-8cac-46bd-968d-baba878cd954\") " pod="openshift-image-registry/node-ca-j8nrt" Apr 16 16:47:45.931975 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.931507 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nng79\" (UniqueName: \"kubernetes.io/projected/40202585-b938-4a5c-bde8-ac1c5ea40044-kube-api-access-nng79\") pod \"ovnkube-node-s2x9b\" (UID: \"40202585-b938-4a5c-bde8-ac1c5ea40044\") " pod="openshift-ovn-kubernetes/ovnkube-node-s2x9b" Apr 16 16:47:45.931975 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.931741 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qd4vd\" (UniqueName: \"kubernetes.io/projected/65f280f9-caf6-429e-ac03-31bd647a05b6-kube-api-access-qd4vd\") pod \"network-metrics-daemon-5rwjz\" (UID: \"65f280f9-caf6-429e-ac03-31bd647a05b6\") " pod="openshift-multus/network-metrics-daemon-5rwjz" Apr 16 16:47:45.931975 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.931785 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rmzvh\" (UniqueName: \"kubernetes.io/projected/6ee9ccfb-aa99-4334-93d4-21dad3568cda-kube-api-access-rmzvh\") pod \"aws-ebs-csi-driver-node-xzx59\" (UID: \"6ee9ccfb-aa99-4334-93d4-21dad3568cda\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xzx59" Apr 16 16:47:45.932594 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.932511 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8vpv\" (UniqueName: \"kubernetes.io/projected/482c17e3-998c-48aa-b158-037aa6ebf920-kube-api-access-n8vpv\") pod \"multus-additional-cni-plugins-brchx\" (UID: \"482c17e3-998c-48aa-b158-037aa6ebf920\") " pod="openshift-multus/multus-additional-cni-plugins-brchx" Apr 16 16:47:45.934108 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.934069 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-63.ec2.internal" event={"ID":"efe8b482663d36b8cd0f3c122fed91e1","Type":"ContainerStarted","Data":"5dd34d5b9df5d551012c252abf6bf81ebec770e91af8f3035a72ea3ead2e2913"} Apr 16 16:47:45.935423 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:45.935382 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-63.ec2.internal" 
event={"ID":"e3c2b6dbe3e0a746e2afaa0da10e7815","Type":"ContainerStarted","Data":"5b765e65f9ad63b34b0f513d8f3a8ddb213f75be4a3240fb648bfeb782e1ab89"} Apr 16 16:47:46.021703 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:46.021618 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8d03bf36-9222-4d14-b16d-5f31b197f11a-host-var-lib-cni-bin\") pod \"multus-nwmhb\" (UID: \"8d03bf36-9222-4d14-b16d-5f31b197f11a\") " pod="openshift-multus/multus-nwmhb" Apr 16 16:47:46.021703 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:46.021667 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/9227133b-0000-4b75-818e-9c2ed2ffe214-etc-tuned\") pod \"tuned-7p8g2\" (UID: \"9227133b-0000-4b75-818e-9c2ed2ffe214\") " pod="openshift-cluster-node-tuning-operator/tuned-7p8g2" Apr 16 16:47:46.021703 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:46.021693 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/8d03bf36-9222-4d14-b16d-5f31b197f11a-host-var-lib-cni-multus\") pod \"multus-nwmhb\" (UID: \"8d03bf36-9222-4d14-b16d-5f31b197f11a\") " pod="openshift-multus/multus-nwmhb" Apr 16 16:47:46.021954 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:46.021715 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9227133b-0000-4b75-818e-9c2ed2ffe214-etc-kubernetes\") pod \"tuned-7p8g2\" (UID: \"9227133b-0000-4b75-818e-9c2ed2ffe214\") " pod="openshift-cluster-node-tuning-operator/tuned-7p8g2" Apr 16 16:47:46.021954 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:46.021741 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8d03bf36-9222-4d14-b16d-5f31b197f11a-host-var-lib-cni-bin\") pod \"multus-nwmhb\" (UID: \"8d03bf36-9222-4d14-b16d-5f31b197f11a\") " pod="openshift-multus/multus-nwmhb" Apr 16 16:47:46.021954 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:46.021767 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/8d03bf36-9222-4d14-b16d-5f31b197f11a-host-var-lib-cni-multus\") pod \"multus-nwmhb\" (UID: \"8d03bf36-9222-4d14-b16d-5f31b197f11a\") " pod="openshift-multus/multus-nwmhb" Apr 16 16:47:46.021954 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:46.021782 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9227133b-0000-4b75-818e-9c2ed2ffe214-etc-kubernetes\") pod \"tuned-7p8g2\" (UID: \"9227133b-0000-4b75-818e-9c2ed2ffe214\") " pod="openshift-cluster-node-tuning-operator/tuned-7p8g2" Apr 16 16:47:46.021954 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:46.021800 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8d03bf36-9222-4d14-b16d-5f31b197f11a-cnibin\") pod \"multus-nwmhb\" (UID: \"8d03bf36-9222-4d14-b16d-5f31b197f11a\") " pod="openshift-multus/multus-nwmhb" Apr 16 16:47:46.021954 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:46.021830 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8d03bf36-9222-4d14-b16d-5f31b197f11a-os-release\") pod 
\"multus-nwmhb\" (UID: \"8d03bf36-9222-4d14-b16d-5f31b197f11a\") " pod="openshift-multus/multus-nwmhb" Apr 16 16:47:46.021954 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:46.021853 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8d03bf36-9222-4d14-b16d-5f31b197f11a-cni-binary-copy\") pod \"multus-nwmhb\" (UID: \"8d03bf36-9222-4d14-b16d-5f31b197f11a\") " pod="openshift-multus/multus-nwmhb" Apr 16 16:47:46.021954 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:46.021881 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/8d03bf36-9222-4d14-b16d-5f31b197f11a-host-run-multus-certs\") pod \"multus-nwmhb\" (UID: \"8d03bf36-9222-4d14-b16d-5f31b197f11a\") " pod="openshift-multus/multus-nwmhb" Apr 16 16:47:46.021954 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:46.021905 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8d03bf36-9222-4d14-b16d-5f31b197f11a-etc-kubernetes\") pod \"multus-nwmhb\" (UID: \"8d03bf36-9222-4d14-b16d-5f31b197f11a\") " pod="openshift-multus/multus-nwmhb" Apr 16 16:47:46.021954 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:46.021928 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8d03bf36-9222-4d14-b16d-5f31b197f11a-os-release\") pod \"multus-nwmhb\" (UID: \"8d03bf36-9222-4d14-b16d-5f31b197f11a\") " pod="openshift-multus/multus-nwmhb" Apr 16 16:47:46.021954 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:46.021931 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/9227133b-0000-4b75-818e-9c2ed2ffe214-etc-modprobe-d\") pod \"tuned-7p8g2\" (UID: \"9227133b-0000-4b75-818e-9c2ed2ffe214\") " pod="openshift-cluster-node-tuning-operator/tuned-7p8g2" Apr 16 16:47:46.022434 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:46.021980 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8d03bf36-9222-4d14-b16d-5f31b197f11a-etc-kubernetes\") pod \"multus-nwmhb\" (UID: \"8d03bf36-9222-4d14-b16d-5f31b197f11a\") " pod="openshift-multus/multus-nwmhb" Apr 16 16:47:46.022434 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:46.021999 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/8d03bf36-9222-4d14-b16d-5f31b197f11a-host-run-multus-certs\") pod \"multus-nwmhb\" (UID: \"8d03bf36-9222-4d14-b16d-5f31b197f11a\") " pod="openshift-multus/multus-nwmhb" Apr 16 16:47:46.022434 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:46.022022 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/9227133b-0000-4b75-818e-9c2ed2ffe214-etc-modprobe-d\") pod \"tuned-7p8g2\" (UID: \"9227133b-0000-4b75-818e-9c2ed2ffe214\") " pod="openshift-cluster-node-tuning-operator/tuned-7p8g2" Apr 16 16:47:46.022434 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:46.022011 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8d03bf36-9222-4d14-b16d-5f31b197f11a-multus-cni-dir\") pod \"multus-nwmhb\" (UID: \"8d03bf36-9222-4d14-b16d-5f31b197f11a\") " 
pod="openshift-multus/multus-nwmhb" Apr 16 16:47:46.022434 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:46.022057 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8d03bf36-9222-4d14-b16d-5f31b197f11a-multus-cni-dir\") pod \"multus-nwmhb\" (UID: \"8d03bf36-9222-4d14-b16d-5f31b197f11a\") " pod="openshift-multus/multus-nwmhb" Apr 16 16:47:46.022434 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:46.022060 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ndr87\" (UniqueName: \"kubernetes.io/projected/8d03bf36-9222-4d14-b16d-5f31b197f11a-kube-api-access-ndr87\") pod \"multus-nwmhb\" (UID: \"8d03bf36-9222-4d14-b16d-5f31b197f11a\") " pod="openshift-multus/multus-nwmhb" Apr 16 16:47:46.022434 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:46.022075 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8d03bf36-9222-4d14-b16d-5f31b197f11a-cnibin\") pod \"multus-nwmhb\" (UID: \"8d03bf36-9222-4d14-b16d-5f31b197f11a\") " pod="openshift-multus/multus-nwmhb" Apr 16 16:47:46.022434 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:46.022091 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9227133b-0000-4b75-818e-9c2ed2ffe214-var-lib-kubelet\") pod \"tuned-7p8g2\" (UID: \"9227133b-0000-4b75-818e-9c2ed2ffe214\") " pod="openshift-cluster-node-tuning-operator/tuned-7p8g2" Apr 16 16:47:46.022434 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:46.022134 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8d03bf36-9222-4d14-b16d-5f31b197f11a-host-run-netns\") pod \"multus-nwmhb\" (UID: \"8d03bf36-9222-4d14-b16d-5f31b197f11a\") " pod="openshift-multus/multus-nwmhb" Apr 16 16:47:46.022434 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:46.022159 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1f2cee13-cf01-40a3-993d-f9a41ddeae81-host-slash\") pod \"iptables-alerter-j6gfs\" (UID: \"1f2cee13-cf01-40a3-993d-f9a41ddeae81\") " pod="openshift-network-operator/iptables-alerter-j6gfs" Apr 16 16:47:46.022434 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:46.022186 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9227133b-0000-4b75-818e-9c2ed2ffe214-var-lib-kubelet\") pod \"tuned-7p8g2\" (UID: \"9227133b-0000-4b75-818e-9c2ed2ffe214\") " pod="openshift-cluster-node-tuning-operator/tuned-7p8g2" Apr 16 16:47:46.022434 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:46.022197 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/8d03bf36-9222-4d14-b16d-5f31b197f11a-host-run-k8s-cni-cncf-io\") pod \"multus-nwmhb\" (UID: \"8d03bf36-9222-4d14-b16d-5f31b197f11a\") " pod="openshift-multus/multus-nwmhb" Apr 16 16:47:46.022434 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:46.022221 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/9227133b-0000-4b75-818e-9c2ed2ffe214-etc-sysctl-d\") pod \"tuned-7p8g2\" (UID: \"9227133b-0000-4b75-818e-9c2ed2ffe214\") " pod="openshift-cluster-node-tuning-operator/tuned-7p8g2" Apr 
16 16:47:46.022434 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:46.022219 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8d03bf36-9222-4d14-b16d-5f31b197f11a-host-run-netns\") pod \"multus-nwmhb\" (UID: \"8d03bf36-9222-4d14-b16d-5f31b197f11a\") " pod="openshift-multus/multus-nwmhb" Apr 16 16:47:46.022434 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:46.022233 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1f2cee13-cf01-40a3-993d-f9a41ddeae81-host-slash\") pod \"iptables-alerter-j6gfs\" (UID: \"1f2cee13-cf01-40a3-993d-f9a41ddeae81\") " pod="openshift-network-operator/iptables-alerter-j6gfs" Apr 16 16:47:46.022434 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:46.022254 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8d03bf36-9222-4d14-b16d-5f31b197f11a-system-cni-dir\") pod \"multus-nwmhb\" (UID: \"8d03bf36-9222-4d14-b16d-5f31b197f11a\") " pod="openshift-multus/multus-nwmhb" Apr 16 16:47:46.022434 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:46.022268 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/8d03bf36-9222-4d14-b16d-5f31b197f11a-host-run-k8s-cni-cncf-io\") pod \"multus-nwmhb\" (UID: \"8d03bf36-9222-4d14-b16d-5f31b197f11a\") " pod="openshift-multus/multus-nwmhb" Apr 16 16:47:46.022434 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:46.022278 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8d03bf36-9222-4d14-b16d-5f31b197f11a-host-var-lib-kubelet\") pod \"multus-nwmhb\" (UID: \"8d03bf36-9222-4d14-b16d-5f31b197f11a\") " pod="openshift-multus/multus-nwmhb" Apr 16 16:47:46.023283 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:46.022302 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/8d03bf36-9222-4d14-b16d-5f31b197f11a-hostroot\") pod \"multus-nwmhb\" (UID: \"8d03bf36-9222-4d14-b16d-5f31b197f11a\") " pod="openshift-multus/multus-nwmhb" Apr 16 16:47:46.023283 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:46.022314 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8d03bf36-9222-4d14-b16d-5f31b197f11a-system-cni-dir\") pod \"multus-nwmhb\" (UID: \"8d03bf36-9222-4d14-b16d-5f31b197f11a\") " pod="openshift-multus/multus-nwmhb" Apr 16 16:47:46.023283 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:46.022329 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/8d03bf36-9222-4d14-b16d-5f31b197f11a-multus-daemon-config\") pod \"multus-nwmhb\" (UID: \"8d03bf36-9222-4d14-b16d-5f31b197f11a\") " pod="openshift-multus/multus-nwmhb" Apr 16 16:47:46.023283 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:46.022337 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/9227133b-0000-4b75-818e-9c2ed2ffe214-etc-sysctl-d\") pod \"tuned-7p8g2\" (UID: \"9227133b-0000-4b75-818e-9c2ed2ffe214\") " pod="openshift-cluster-node-tuning-operator/tuned-7p8g2" Apr 16 16:47:46.023283 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:46.022354 2568 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9227133b-0000-4b75-818e-9c2ed2ffe214-tmp\") pod \"tuned-7p8g2\" (UID: \"9227133b-0000-4b75-818e-9c2ed2ffe214\") " pod="openshift-cluster-node-tuning-operator/tuned-7p8g2" Apr 16 16:47:46.023283 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:46.022352 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8d03bf36-9222-4d14-b16d-5f31b197f11a-host-var-lib-kubelet\") pod \"multus-nwmhb\" (UID: \"8d03bf36-9222-4d14-b16d-5f31b197f11a\") " pod="openshift-multus/multus-nwmhb" Apr 16 16:47:46.023283 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:46.022369 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/8d03bf36-9222-4d14-b16d-5f31b197f11a-hostroot\") pod \"multus-nwmhb\" (UID: \"8d03bf36-9222-4d14-b16d-5f31b197f11a\") " pod="openshift-multus/multus-nwmhb" Apr 16 16:47:46.023283 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:46.022386 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/9227133b-0000-4b75-818e-9c2ed2ffe214-etc-systemd\") pod \"tuned-7p8g2\" (UID: \"9227133b-0000-4b75-818e-9c2ed2ffe214\") " pod="openshift-cluster-node-tuning-operator/tuned-7p8g2" Apr 16 16:47:46.023283 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:46.022401 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8d03bf36-9222-4d14-b16d-5f31b197f11a-cni-binary-copy\") pod \"multus-nwmhb\" (UID: \"8d03bf36-9222-4d14-b16d-5f31b197f11a\") " pod="openshift-multus/multus-nwmhb" Apr 16 16:47:46.023283 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:46.022413 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9227133b-0000-4b75-818e-9c2ed2ffe214-host\") pod \"tuned-7p8g2\" (UID: \"9227133b-0000-4b75-818e-9c2ed2ffe214\") " pod="openshift-cluster-node-tuning-operator/tuned-7p8g2" Apr 16 16:47:46.023283 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:46.022439 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8d03bf36-9222-4d14-b16d-5f31b197f11a-multus-conf-dir\") pod \"multus-nwmhb\" (UID: \"8d03bf36-9222-4d14-b16d-5f31b197f11a\") " pod="openshift-multus/multus-nwmhb" Apr 16 16:47:46.023283 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:46.022445 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/9227133b-0000-4b75-818e-9c2ed2ffe214-etc-systemd\") pod \"tuned-7p8g2\" (UID: \"9227133b-0000-4b75-818e-9c2ed2ffe214\") " pod="openshift-cluster-node-tuning-operator/tuned-7p8g2" Apr 16 16:47:46.023283 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:46.022460 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9227133b-0000-4b75-818e-9c2ed2ffe214-sys\") pod \"tuned-7p8g2\" (UID: \"9227133b-0000-4b75-818e-9c2ed2ffe214\") " pod="openshift-cluster-node-tuning-operator/tuned-7p8g2" Apr 16 16:47:46.023283 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:46.022467 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/9227133b-0000-4b75-818e-9c2ed2ffe214-host\") pod \"tuned-7p8g2\" (UID: \"9227133b-0000-4b75-818e-9c2ed2ffe214\") " pod="openshift-cluster-node-tuning-operator/tuned-7p8g2" Apr 16 16:47:46.023283 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:46.022490 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9227133b-0000-4b75-818e-9c2ed2ffe214-lib-modules\") pod \"tuned-7p8g2\" (UID: \"9227133b-0000-4b75-818e-9c2ed2ffe214\") " pod="openshift-cluster-node-tuning-operator/tuned-7p8g2" Apr 16 16:47:46.023283 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:46.022502 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9227133b-0000-4b75-818e-9c2ed2ffe214-sys\") pod \"tuned-7p8g2\" (UID: \"9227133b-0000-4b75-818e-9c2ed2ffe214\") " pod="openshift-cluster-node-tuning-operator/tuned-7p8g2" Apr 16 16:47:46.023283 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:46.022533 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/1f2cee13-cf01-40a3-993d-f9a41ddeae81-iptables-alerter-script\") pod \"iptables-alerter-j6gfs\" (UID: \"1f2cee13-cf01-40a3-993d-f9a41ddeae81\") " pod="openshift-network-operator/iptables-alerter-j6gfs" Apr 16 16:47:46.023283 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:46.022539 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8d03bf36-9222-4d14-b16d-5f31b197f11a-multus-conf-dir\") pod \"multus-nwmhb\" (UID: \"8d03bf36-9222-4d14-b16d-5f31b197f11a\") " pod="openshift-multus/multus-nwmhb" Apr 16 16:47:46.024051 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:46.022562 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vghl6\" (UniqueName: \"kubernetes.io/projected/1f2cee13-cf01-40a3-993d-f9a41ddeae81-kube-api-access-vghl6\") pod \"iptables-alerter-j6gfs\" (UID: \"1f2cee13-cf01-40a3-993d-f9a41ddeae81\") " pod="openshift-network-operator/iptables-alerter-j6gfs" Apr 16 16:47:46.024051 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:46.022591 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/8d03bf36-9222-4d14-b16d-5f31b197f11a-multus-socket-dir-parent\") pod \"multus-nwmhb\" (UID: \"8d03bf36-9222-4d14-b16d-5f31b197f11a\") " pod="openshift-multus/multus-nwmhb" Apr 16 16:47:46.024051 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:46.022612 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9227133b-0000-4b75-818e-9c2ed2ffe214-lib-modules\") pod \"tuned-7p8g2\" (UID: \"9227133b-0000-4b75-818e-9c2ed2ffe214\") " pod="openshift-cluster-node-tuning-operator/tuned-7p8g2" Apr 16 16:47:46.024051 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:46.022616 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/49be5da9-85b8-4a24-8fec-db2c506efbbc-konnectivity-ca\") pod \"konnectivity-agent-d4xtv\" (UID: \"49be5da9-85b8-4a24-8fec-db2c506efbbc\") " pod="kube-system/konnectivity-agent-d4xtv" Apr 16 16:47:46.024051 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:46.022654 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"agent-certs\" (UniqueName: \"kubernetes.io/secret/49be5da9-85b8-4a24-8fec-db2c506efbbc-agent-certs\") pod \"konnectivity-agent-d4xtv\" (UID: \"49be5da9-85b8-4a24-8fec-db2c506efbbc\") " pod="kube-system/konnectivity-agent-d4xtv" Apr 16 16:47:46.024051 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:46.022681 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/9227133b-0000-4b75-818e-9c2ed2ffe214-etc-sysctl-conf\") pod \"tuned-7p8g2\" (UID: \"9227133b-0000-4b75-818e-9c2ed2ffe214\") " pod="openshift-cluster-node-tuning-operator/tuned-7p8g2" Apr 16 16:47:46.024051 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:46.022707 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5mcc4\" (UniqueName: \"kubernetes.io/projected/9227133b-0000-4b75-818e-9c2ed2ffe214-kube-api-access-5mcc4\") pod \"tuned-7p8g2\" (UID: \"9227133b-0000-4b75-818e-9c2ed2ffe214\") " pod="openshift-cluster-node-tuning-operator/tuned-7p8g2" Apr 16 16:47:46.024051 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:46.022739 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/9227133b-0000-4b75-818e-9c2ed2ffe214-etc-sysconfig\") pod \"tuned-7p8g2\" (UID: \"9227133b-0000-4b75-818e-9c2ed2ffe214\") " pod="openshift-cluster-node-tuning-operator/tuned-7p8g2" Apr 16 16:47:46.024051 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:46.022764 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/9227133b-0000-4b75-818e-9c2ed2ffe214-run\") pod \"tuned-7p8g2\" (UID: \"9227133b-0000-4b75-818e-9c2ed2ffe214\") " pod="openshift-cluster-node-tuning-operator/tuned-7p8g2" Apr 16 16:47:46.024051 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:46.022837 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/9227133b-0000-4b75-818e-9c2ed2ffe214-run\") pod \"tuned-7p8g2\" (UID: \"9227133b-0000-4b75-818e-9c2ed2ffe214\") " pod="openshift-cluster-node-tuning-operator/tuned-7p8g2" Apr 16 16:47:46.024051 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:46.022856 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/8d03bf36-9222-4d14-b16d-5f31b197f11a-multus-daemon-config\") pod \"multus-nwmhb\" (UID: \"8d03bf36-9222-4d14-b16d-5f31b197f11a\") " pod="openshift-multus/multus-nwmhb" Apr 16 16:47:46.024051 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:46.022865 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/8d03bf36-9222-4d14-b16d-5f31b197f11a-multus-socket-dir-parent\") pod \"multus-nwmhb\" (UID: \"8d03bf36-9222-4d14-b16d-5f31b197f11a\") " pod="openshift-multus/multus-nwmhb" Apr 16 16:47:46.024051 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:46.022922 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/9227133b-0000-4b75-818e-9c2ed2ffe214-etc-sysconfig\") pod \"tuned-7p8g2\" (UID: \"9227133b-0000-4b75-818e-9c2ed2ffe214\") " pod="openshift-cluster-node-tuning-operator/tuned-7p8g2" Apr 16 16:47:46.024051 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:46.022983 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" 
(UniqueName: \"kubernetes.io/host-path/9227133b-0000-4b75-818e-9c2ed2ffe214-etc-sysctl-conf\") pod \"tuned-7p8g2\" (UID: \"9227133b-0000-4b75-818e-9c2ed2ffe214\") " pod="openshift-cluster-node-tuning-operator/tuned-7p8g2" Apr 16 16:47:46.024051 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:46.023022 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/1f2cee13-cf01-40a3-993d-f9a41ddeae81-iptables-alerter-script\") pod \"iptables-alerter-j6gfs\" (UID: \"1f2cee13-cf01-40a3-993d-f9a41ddeae81\") " pod="openshift-network-operator/iptables-alerter-j6gfs" Apr 16 16:47:46.024051 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:46.023139 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/49be5da9-85b8-4a24-8fec-db2c506efbbc-konnectivity-ca\") pod \"konnectivity-agent-d4xtv\" (UID: \"49be5da9-85b8-4a24-8fec-db2c506efbbc\") " pod="kube-system/konnectivity-agent-d4xtv" Apr 16 16:47:46.024719 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:46.024365 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/9227133b-0000-4b75-818e-9c2ed2ffe214-etc-tuned\") pod \"tuned-7p8g2\" (UID: \"9227133b-0000-4b75-818e-9c2ed2ffe214\") " pod="openshift-cluster-node-tuning-operator/tuned-7p8g2" Apr 16 16:47:46.025232 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:46.025213 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/49be5da9-85b8-4a24-8fec-db2c506efbbc-agent-certs\") pod \"konnectivity-agent-d4xtv\" (UID: \"49be5da9-85b8-4a24-8fec-db2c506efbbc\") " pod="kube-system/konnectivity-agent-d4xtv" Apr 16 16:47:46.025631 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:46.025605 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9227133b-0000-4b75-818e-9c2ed2ffe214-tmp\") pod \"tuned-7p8g2\" (UID: \"9227133b-0000-4b75-818e-9c2ed2ffe214\") " pod="openshift-cluster-node-tuning-operator/tuned-7p8g2" Apr 16 16:47:46.030087 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:46.030055 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndr87\" (UniqueName: \"kubernetes.io/projected/8d03bf36-9222-4d14-b16d-5f31b197f11a-kube-api-access-ndr87\") pod \"multus-nwmhb\" (UID: \"8d03bf36-9222-4d14-b16d-5f31b197f11a\") " pod="openshift-multus/multus-nwmhb" Apr 16 16:47:46.030194 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:46.030105 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vghl6\" (UniqueName: \"kubernetes.io/projected/1f2cee13-cf01-40a3-993d-f9a41ddeae81-kube-api-access-vghl6\") pod \"iptables-alerter-j6gfs\" (UID: \"1f2cee13-cf01-40a3-993d-f9a41ddeae81\") " pod="openshift-network-operator/iptables-alerter-j6gfs" Apr 16 16:47:46.030272 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:46.030251 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5mcc4\" (UniqueName: \"kubernetes.io/projected/9227133b-0000-4b75-818e-9c2ed2ffe214-kube-api-access-5mcc4\") pod \"tuned-7p8g2\" (UID: \"9227133b-0000-4b75-818e-9c2ed2ffe214\") " pod="openshift-cluster-node-tuning-operator/tuned-7p8g2" Apr 16 16:47:46.106316 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:46.106270 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-ptmqw" Apr 16 16:47:46.114346 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:46.114322 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-j8nrt" Apr 16 16:47:46.123244 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:46.123207 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-s2x9b" Apr 16 16:47:46.129942 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:46.129918 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xzx59" Apr 16 16:47:46.136598 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:46.136573 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-brchx" Apr 16 16:47:46.148410 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:46.148256 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-d4xtv" Apr 16 16:47:46.157139 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:46.157105 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-nwmhb" Apr 16 16:47:46.162770 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:46.162748 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-7p8g2" Apr 16 16:47:46.168308 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:46.168290 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-j6gfs" Apr 16 16:47:46.425356 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:46.425291 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/65f280f9-caf6-429e-ac03-31bd647a05b6-metrics-certs\") pod \"network-metrics-daemon-5rwjz\" (UID: \"65f280f9-caf6-429e-ac03-31bd647a05b6\") " pod="openshift-multus/network-metrics-daemon-5rwjz" Apr 16 16:47:46.425508 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:47:46.425410 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:47:46.425508 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:47:46.425485 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/65f280f9-caf6-429e-ac03-31bd647a05b6-metrics-certs podName:65f280f9-caf6-429e-ac03-31bd647a05b6 nodeName:}" failed. No retries permitted until 2026-04-16 16:47:47.425457335 +0000 UTC m=+4.095634109 (durationBeforeRetry 1s). 
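The durationBeforeRetry above starts at 1s for this volume and doubles on each subsequent failure (2s, 4s, 8s appear later in this log; the original-pull-secret mount further down starts at 500ms). A minimal Go sketch of that doubling backoff policy — illustrative only, with an assumed cap, not the kubelet's actual nestedpendingoperations code:

package main

import (
	"fmt"
	"time"
)

// nextBackoff doubles the previous retry delay up to an assumed cap,
// matching the 500ms -> 1s -> 2s -> 4s -> 8s progression of the
// durationBeforeRetry values in this log.
func nextBackoff(prev, initial, maxDelay time.Duration) time.Duration {
	if prev == 0 {
		return initial
	}
	if next := prev * 2; next < maxDelay {
		return next
	}
	return maxDelay
}

func main() {
	var d time.Duration
	for i := 0; i < 6; i++ {
		d = nextBackoff(d, 500*time.Millisecond, 2*time.Minute) // the cap is an assumption
		fmt.Println(d)                                          // 500ms 1s 2s 4s 8s 16s
	}
}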
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/65f280f9-caf6-429e-ac03-31bd647a05b6-metrics-certs") pod "network-metrics-daemon-5rwjz" (UID: "65f280f9-caf6-429e-ac03-31bd647a05b6") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:47:46.526470 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:46.526437 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h2x5p\" (UniqueName: \"kubernetes.io/projected/018fd32e-3479-4227-9d81-8a232b27fc2b-kube-api-access-h2x5p\") pod \"network-check-target-ts7hl\" (UID: \"018fd32e-3479-4227-9d81-8a232b27fc2b\") " pod="openshift-network-diagnostics/network-check-target-ts7hl" Apr 16 16:47:46.526631 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:47:46.526610 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 16:47:46.526631 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:47:46.526627 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 16:47:46.526703 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:47:46.526637 2568 projected.go:194] Error preparing data for projected volume kube-api-access-h2x5p for pod openshift-network-diagnostics/network-check-target-ts7hl: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:47:46.526703 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:47:46.526687 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/018fd32e-3479-4227-9d81-8a232b27fc2b-kube-api-access-h2x5p podName:018fd32e-3479-4227-9d81-8a232b27fc2b nodeName:}" failed. No retries permitted until 2026-04-16 16:47:47.526674441 +0000 UTC m=+4.196851197 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-h2x5p" (UniqueName: "kubernetes.io/projected/018fd32e-3479-4227-9d81-8a232b27fc2b-kube-api-access-h2x5p") pod "network-check-target-ts7hl" (UID: "018fd32e-3479-4227-9d81-8a232b27fc2b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:47:46.671372 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:46.671336 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0c447c29_c470_4314_becc_ad24580321c8.slice/crio-78e25a89486ba3386f302706a9f2c47dfc996aa43705abb7954c675ea3ede3d8 WatchSource:0}: Error finding container 78e25a89486ba3386f302706a9f2c47dfc996aa43705abb7954c675ea3ede3d8: Status 404 returned error can't find the container with id 78e25a89486ba3386f302706a9f2c47dfc996aa43705abb7954c675ea3ede3d8 Apr 16 16:47:46.675117 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:46.675033 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod40202585_b938_4a5c_bde8_ac1c5ea40044.slice/crio-b7502388b65b289c2ff5ec0f3b715bb663498bf3fcb615fdcd2a6d1a7d4cd3f4 WatchSource:0}: Error finding container b7502388b65b289c2ff5ec0f3b715bb663498bf3fcb615fdcd2a6d1a7d4cd3f4: Status 404 returned error can't find the container with id b7502388b65b289c2ff5ec0f3b715bb663498bf3fcb615fdcd2a6d1a7d4cd3f4 Apr 16 16:47:46.678301 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:46.678248 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6ee9ccfb_aa99_4334_93d4_21dad3568cda.slice/crio-22f917ba3a6b29e02035b4c86d8ffe62553ae6227a8dda4989b5e4d3f6a2edc6 WatchSource:0}: Error finding container 22f917ba3a6b29e02035b4c86d8ffe62553ae6227a8dda4989b5e4d3f6a2edc6: Status 404 returned error can't find the container with id 22f917ba3a6b29e02035b4c86d8ffe62553ae6227a8dda4989b5e4d3f6a2edc6 Apr 16 16:47:46.679578 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:46.679548 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod482c17e3_998c_48aa_b158_037aa6ebf920.slice/crio-8c694cd7402d9f9c195ace9f090f8d8f5a1b2087ea857994e9c4469a93fa3b8f WatchSource:0}: Error finding container 8c694cd7402d9f9c195ace9f090f8d8f5a1b2087ea857994e9c4469a93fa3b8f: Status 404 returned error can't find the container with id 8c694cd7402d9f9c195ace9f090f8d8f5a1b2087ea857994e9c4469a93fa3b8f Apr 16 16:47:46.680394 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:46.680361 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1f2cee13_cf01_40a3_993d_f9a41ddeae81.slice/crio-2e1193a6ae90619a690664cf8338ba4f6d29f2fed146249db35361da5786e50d WatchSource:0}: Error finding container 2e1193a6ae90619a690664cf8338ba4f6d29f2fed146249db35361da5786e50d: Status 404 returned error can't find the container with id 2e1193a6ae90619a690664cf8338ba4f6d29f2fed146249db35361da5786e50d Apr 16 16:47:46.681742 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:46.681720 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod49be5da9_85b8_4a24_8fec_db2c506efbbc.slice/crio-520978c5be8d87d06a8e24abc235a4bf3782493da69ad00ae4adb233e477f500 WatchSource:0}: Error finding 
container 520978c5be8d87d06a8e24abc235a4bf3782493da69ad00ae4adb233e477f500: Status 404 returned error can't find the container with id 520978c5be8d87d06a8e24abc235a4bf3782493da69ad00ae4adb233e477f500 Apr 16 16:47:46.682422 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:46.682376 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8d03bf36_9222_4d14_b16d_5f31b197f11a.slice/crio-17208709ef7267f5c3f835c67cf6799e6d09e494c9f5202b52f559ab4404e0a9 WatchSource:0}: Error finding container 17208709ef7267f5c3f835c67cf6799e6d09e494c9f5202b52f559ab4404e0a9: Status 404 returned error can't find the container with id 17208709ef7267f5c3f835c67cf6799e6d09e494c9f5202b52f559ab4404e0a9 Apr 16 16:47:46.684401 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:46.683664 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc39d9cc0_8cac_46bd_968d_baba878cd954.slice/crio-620a559d8b225990855e129e345be2d071b394d6b063fcba9f03800bbf5119a6 WatchSource:0}: Error finding container 620a559d8b225990855e129e345be2d071b394d6b063fcba9f03800bbf5119a6: Status 404 returned error can't find the container with id 620a559d8b225990855e129e345be2d071b394d6b063fcba9f03800bbf5119a6 Apr 16 16:47:46.684401 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:47:46.684262 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9227133b_0000_4b75_818e_9c2ed2ffe214.slice/crio-29ec4f68966feb747f743b029e2820e8efc0e19830139b4a46eb97ba5d77f610 WatchSource:0}: Error finding container 29ec4f68966feb747f743b029e2820e8efc0e19830139b4a46eb97ba5d77f610: Status 404 returned error can't find the container with id 29ec4f68966feb747f743b029e2820e8efc0e19830139b4a46eb97ba5d77f610 Apr 16 16:47:46.854027 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:46.853992 2568 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 16:42:44 +0000 UTC" deadline="2027-11-01 22:46:20.648339128 +0000 UTC" Apr 16 16:47:46.854027 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:46.854022 2568 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13541h58m33.794319312s" Apr 16 16:47:46.939901 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:46.939819 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-7p8g2" event={"ID":"9227133b-0000-4b75-818e-9c2ed2ffe214","Type":"ContainerStarted","Data":"29ec4f68966feb747f743b029e2820e8efc0e19830139b4a46eb97ba5d77f610"} Apr 16 16:47:46.940809 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:46.940783 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-j8nrt" event={"ID":"c39d9cc0-8cac-46bd-968d-baba878cd954","Type":"ContainerStarted","Data":"620a559d8b225990855e129e345be2d071b394d6b063fcba9f03800bbf5119a6"} Apr 16 16:47:46.947426 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:46.947401 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-nwmhb" event={"ID":"8d03bf36-9222-4d14-b16d-5f31b197f11a","Type":"ContainerStarted","Data":"17208709ef7267f5c3f835c67cf6799e6d09e494c9f5202b52f559ab4404e0a9"} Apr 16 16:47:46.948571 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:46.948543 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-operator/iptables-alerter-j6gfs" event={"ID":"1f2cee13-cf01-40a3-993d-f9a41ddeae81","Type":"ContainerStarted","Data":"2e1193a6ae90619a690664cf8338ba4f6d29f2fed146249db35361da5786e50d"} Apr 16 16:47:46.949372 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:46.949352 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-brchx" event={"ID":"482c17e3-998c-48aa-b158-037aa6ebf920","Type":"ContainerStarted","Data":"8c694cd7402d9f9c195ace9f090f8d8f5a1b2087ea857994e9c4469a93fa3b8f"} Apr 16 16:47:46.950322 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:46.950300 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-ptmqw" event={"ID":"0c447c29-c470-4314-becc-ad24580321c8","Type":"ContainerStarted","Data":"78e25a89486ba3386f302706a9f2c47dfc996aa43705abb7954c675ea3ede3d8"} Apr 16 16:47:46.951657 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:46.951637 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-63.ec2.internal" event={"ID":"efe8b482663d36b8cd0f3c122fed91e1","Type":"ContainerStarted","Data":"975f82b01900ebeaa8e7608749722b5dfec71b7a7c4fee48f989e30747fb63af"} Apr 16 16:47:46.954920 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:46.954895 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-d4xtv" event={"ID":"49be5da9-85b8-4a24-8fec-db2c506efbbc","Type":"ContainerStarted","Data":"520978c5be8d87d06a8e24abc235a4bf3782493da69ad00ae4adb233e477f500"} Apr 16 16:47:46.955795 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:46.955777 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s2x9b" event={"ID":"40202585-b938-4a5c-bde8-ac1c5ea40044","Type":"ContainerStarted","Data":"b7502388b65b289c2ff5ec0f3b715bb663498bf3fcb615fdcd2a6d1a7d4cd3f4"} Apr 16 16:47:46.956629 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:46.956610 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xzx59" event={"ID":"6ee9ccfb-aa99-4334-93d4-21dad3568cda","Type":"ContainerStarted","Data":"22f917ba3a6b29e02035b4c86d8ffe62553ae6227a8dda4989b5e4d3f6a2edc6"} Apr 16 16:47:46.967073 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:46.967028 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-63.ec2.internal" podStartSLOduration=1.967014028 podStartE2EDuration="1.967014028s" podCreationTimestamp="2026-04-16 16:47:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:47:46.966348338 +0000 UTC m=+3.636525116" watchObservedRunningTime="2026-04-16 16:47:46.967014028 +0000 UTC m=+3.637190807" Apr 16 16:47:47.437484 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:47.436828 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/65f280f9-caf6-429e-ac03-31bd647a05b6-metrics-certs\") pod \"network-metrics-daemon-5rwjz\" (UID: \"65f280f9-caf6-429e-ac03-31bd647a05b6\") " pod="openshift-multus/network-metrics-daemon-5rwjz" Apr 16 16:47:47.437484 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:47:47.437004 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:47:47.437484 ip-10-0-131-63 kubenswrapper[2568]: 
E0416 16:47:47.437066 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/65f280f9-caf6-429e-ac03-31bd647a05b6-metrics-certs podName:65f280f9-caf6-429e-ac03-31bd647a05b6 nodeName:}" failed. No retries permitted until 2026-04-16 16:47:49.437047657 +0000 UTC m=+6.107224418 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/65f280f9-caf6-429e-ac03-31bd647a05b6-metrics-certs") pod "network-metrics-daemon-5rwjz" (UID: "65f280f9-caf6-429e-ac03-31bd647a05b6") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:47:47.537861 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:47.537816 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h2x5p\" (UniqueName: \"kubernetes.io/projected/018fd32e-3479-4227-9d81-8a232b27fc2b-kube-api-access-h2x5p\") pod \"network-check-target-ts7hl\" (UID: \"018fd32e-3479-4227-9d81-8a232b27fc2b\") " pod="openshift-network-diagnostics/network-check-target-ts7hl" Apr 16 16:47:47.538032 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:47:47.537988 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 16:47:47.538032 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:47:47.538006 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 16:47:47.538032 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:47:47.538019 2568 projected.go:194] Error preparing data for projected volume kube-api-access-h2x5p for pod openshift-network-diagnostics/network-check-target-ts7hl: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:47:47.538191 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:47:47.538078 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/018fd32e-3479-4227-9d81-8a232b27fc2b-kube-api-access-h2x5p podName:018fd32e-3479-4227-9d81-8a232b27fc2b nodeName:}" failed. No retries permitted until 2026-04-16 16:47:49.538058775 +0000 UTC m=+6.208235549 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-h2x5p" (UniqueName: "kubernetes.io/projected/018fd32e-3479-4227-9d81-8a232b27fc2b-kube-api-access-h2x5p") pod "network-check-target-ts7hl" (UID: "018fd32e-3479-4227-9d81-8a232b27fc2b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:47:47.574948 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:47.574912 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-4bqrw"] Apr 16 16:47:47.577273 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:47.576922 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-4bqrw" Apr 16 16:47:47.577273 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:47:47.576996 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-4bqrw" podUID="5ae3c2f7-2de4-4c3c-9cf8-bc6213e92f77" Apr 16 16:47:47.638848 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:47.638708 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/5ae3c2f7-2de4-4c3c-9cf8-bc6213e92f77-kubelet-config\") pod \"global-pull-secret-syncer-4bqrw\" (UID: \"5ae3c2f7-2de4-4c3c-9cf8-bc6213e92f77\") " pod="kube-system/global-pull-secret-syncer-4bqrw" Apr 16 16:47:47.638848 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:47.638758 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/5ae3c2f7-2de4-4c3c-9cf8-bc6213e92f77-dbus\") pod \"global-pull-secret-syncer-4bqrw\" (UID: \"5ae3c2f7-2de4-4c3c-9cf8-bc6213e92f77\") " pod="kube-system/global-pull-secret-syncer-4bqrw" Apr 16 16:47:47.638848 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:47.638786 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/5ae3c2f7-2de4-4c3c-9cf8-bc6213e92f77-original-pull-secret\") pod \"global-pull-secret-syncer-4bqrw\" (UID: \"5ae3c2f7-2de4-4c3c-9cf8-bc6213e92f77\") " pod="kube-system/global-pull-secret-syncer-4bqrw" Apr 16 16:47:47.740376 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:47.739675 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/5ae3c2f7-2de4-4c3c-9cf8-bc6213e92f77-kubelet-config\") pod \"global-pull-secret-syncer-4bqrw\" (UID: \"5ae3c2f7-2de4-4c3c-9cf8-bc6213e92f77\") " pod="kube-system/global-pull-secret-syncer-4bqrw" Apr 16 16:47:47.740376 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:47.739723 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/5ae3c2f7-2de4-4c3c-9cf8-bc6213e92f77-dbus\") pod \"global-pull-secret-syncer-4bqrw\" (UID: \"5ae3c2f7-2de4-4c3c-9cf8-bc6213e92f77\") " pod="kube-system/global-pull-secret-syncer-4bqrw" Apr 16 16:47:47.740376 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:47.739752 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/5ae3c2f7-2de4-4c3c-9cf8-bc6213e92f77-original-pull-secret\") pod \"global-pull-secret-syncer-4bqrw\" (UID: \"5ae3c2f7-2de4-4c3c-9cf8-bc6213e92f77\") " pod="kube-system/global-pull-secret-syncer-4bqrw" Apr 16 16:47:47.740376 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:47:47.739924 2568 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 16:47:47.740376 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:47:47.739983 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5ae3c2f7-2de4-4c3c-9cf8-bc6213e92f77-original-pull-secret podName:5ae3c2f7-2de4-4c3c-9cf8-bc6213e92f77 nodeName:}" failed. No retries permitted until 2026-04-16 16:47:48.239964549 +0000 UTC m=+4.910141319 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/5ae3c2f7-2de4-4c3c-9cf8-bc6213e92f77-original-pull-secret") pod "global-pull-secret-syncer-4bqrw" (UID: "5ae3c2f7-2de4-4c3c-9cf8-bc6213e92f77") : object "kube-system"/"original-pull-secret" not registered Apr 16 16:47:47.740376 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:47.740211 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/5ae3c2f7-2de4-4c3c-9cf8-bc6213e92f77-kubelet-config\") pod \"global-pull-secret-syncer-4bqrw\" (UID: \"5ae3c2f7-2de4-4c3c-9cf8-bc6213e92f77\") " pod="kube-system/global-pull-secret-syncer-4bqrw" Apr 16 16:47:47.740376 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:47.740335 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/5ae3c2f7-2de4-4c3c-9cf8-bc6213e92f77-dbus\") pod \"global-pull-secret-syncer-4bqrw\" (UID: \"5ae3c2f7-2de4-4c3c-9cf8-bc6213e92f77\") " pod="kube-system/global-pull-secret-syncer-4bqrw" Apr 16 16:47:47.931072 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:47.931040 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ts7hl" Apr 16 16:47:47.931605 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:47:47.931167 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ts7hl" podUID="018fd32e-3479-4227-9d81-8a232b27fc2b" Apr 16 16:47:47.932551 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:47.932512 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5rwjz" Apr 16 16:47:47.932662 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:47:47.932639 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-5rwjz" podUID="65f280f9-caf6-429e-ac03-31bd647a05b6" Apr 16 16:47:47.968626 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:47.968548 2568 generic.go:358] "Generic (PLEG): container finished" podID="e3c2b6dbe3e0a746e2afaa0da10e7815" containerID="d8e8b9ea4a507907c395d6f99bf1af4b5824abf9f9dadd03823ed7aa37d830f9" exitCode=0 Apr 16 16:47:47.969587 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:47.969559 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-63.ec2.internal" event={"ID":"e3c2b6dbe3e0a746e2afaa0da10e7815","Type":"ContainerDied","Data":"d8e8b9ea4a507907c395d6f99bf1af4b5824abf9f9dadd03823ed7aa37d830f9"} Apr 16 16:47:48.244608 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:48.243925 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/5ae3c2f7-2de4-4c3c-9cf8-bc6213e92f77-original-pull-secret\") pod \"global-pull-secret-syncer-4bqrw\" (UID: \"5ae3c2f7-2de4-4c3c-9cf8-bc6213e92f77\") " pod="kube-system/global-pull-secret-syncer-4bqrw" Apr 16 16:47:48.244608 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:47:48.244139 2568 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 16:47:48.244608 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:47:48.244199 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5ae3c2f7-2de4-4c3c-9cf8-bc6213e92f77-original-pull-secret podName:5ae3c2f7-2de4-4c3c-9cf8-bc6213e92f77 nodeName:}" failed. No retries permitted until 2026-04-16 16:47:49.244182645 +0000 UTC m=+5.914359417 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/5ae3c2f7-2de4-4c3c-9cf8-bc6213e92f77-original-pull-secret") pod "global-pull-secret-syncer-4bqrw" (UID: "5ae3c2f7-2de4-4c3c-9cf8-bc6213e92f77") : object "kube-system"/"original-pull-secret" not registered Apr 16 16:47:48.929128 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:48.929099 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-4bqrw" Apr 16 16:47:48.929312 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:47:48.929220 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-4bqrw" podUID="5ae3c2f7-2de4-4c3c-9cf8-bc6213e92f77" Apr 16 16:47:48.979551 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:48.979348 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-63.ec2.internal" event={"ID":"e3c2b6dbe3e0a746e2afaa0da10e7815","Type":"ContainerStarted","Data":"e14d73ce38ccf5eef42d35413ad22634c4d3f831dfae72e13329318f2bff423f"} Apr 16 16:47:48.995295 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:48.995239 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-63.ec2.internal" podStartSLOduration=3.995220123 podStartE2EDuration="3.995220123s" podCreationTimestamp="2026-04-16 16:47:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:47:48.994870283 +0000 UTC m=+5.665047062" watchObservedRunningTime="2026-04-16 16:47:48.995220123 +0000 UTC m=+5.665396905" Apr 16 16:47:49.254055 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:49.253809 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/5ae3c2f7-2de4-4c3c-9cf8-bc6213e92f77-original-pull-secret\") pod \"global-pull-secret-syncer-4bqrw\" (UID: \"5ae3c2f7-2de4-4c3c-9cf8-bc6213e92f77\") " pod="kube-system/global-pull-secret-syncer-4bqrw" Apr 16 16:47:49.254055 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:47:49.253959 2568 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 16:47:49.254055 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:47:49.254021 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5ae3c2f7-2de4-4c3c-9cf8-bc6213e92f77-original-pull-secret podName:5ae3c2f7-2de4-4c3c-9cf8-bc6213e92f77 nodeName:}" failed. No retries permitted until 2026-04-16 16:47:51.254004955 +0000 UTC m=+7.924181725 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/5ae3c2f7-2de4-4c3c-9cf8-bc6213e92f77-original-pull-secret") pod "global-pull-secret-syncer-4bqrw" (UID: "5ae3c2f7-2de4-4c3c-9cf8-bc6213e92f77") : object "kube-system"/"original-pull-secret" not registered Apr 16 16:47:49.455710 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:49.455144 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/65f280f9-caf6-429e-ac03-31bd647a05b6-metrics-certs\") pod \"network-metrics-daemon-5rwjz\" (UID: \"65f280f9-caf6-429e-ac03-31bd647a05b6\") " pod="openshift-multus/network-metrics-daemon-5rwjz" Apr 16 16:47:49.455710 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:47:49.455280 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:47:49.455710 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:47:49.455341 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/65f280f9-caf6-429e-ac03-31bd647a05b6-metrics-certs podName:65f280f9-caf6-429e-ac03-31bd647a05b6 nodeName:}" failed. No retries permitted until 2026-04-16 16:47:53.455322218 +0000 UTC m=+10.125498981 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/65f280f9-caf6-429e-ac03-31bd647a05b6-metrics-certs") pod "network-metrics-daemon-5rwjz" (UID: "65f280f9-caf6-429e-ac03-31bd647a05b6") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:47:49.556611 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:49.555911 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h2x5p\" (UniqueName: \"kubernetes.io/projected/018fd32e-3479-4227-9d81-8a232b27fc2b-kube-api-access-h2x5p\") pod \"network-check-target-ts7hl\" (UID: \"018fd32e-3479-4227-9d81-8a232b27fc2b\") " pod="openshift-network-diagnostics/network-check-target-ts7hl" Apr 16 16:47:49.556611 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:47:49.556064 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 16:47:49.556611 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:47:49.556086 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 16:47:49.556611 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:47:49.556097 2568 projected.go:194] Error preparing data for projected volume kube-api-access-h2x5p for pod openshift-network-diagnostics/network-check-target-ts7hl: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:47:49.556611 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:47:49.556154 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/018fd32e-3479-4227-9d81-8a232b27fc2b-kube-api-access-h2x5p podName:018fd32e-3479-4227-9d81-8a232b27fc2b nodeName:}" failed. No retries permitted until 2026-04-16 16:47:53.55613667 +0000 UTC m=+10.226313436 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-h2x5p" (UniqueName: "kubernetes.io/projected/018fd32e-3479-4227-9d81-8a232b27fc2b-kube-api-access-h2x5p") pod "network-check-target-ts7hl" (UID: "018fd32e-3479-4227-9d81-8a232b27fc2b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:47:49.929529 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:49.929421 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5rwjz" Apr 16 16:47:49.929529 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:49.929475 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ts7hl" Apr 16 16:47:49.929767 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:47:49.929595 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-5rwjz" podUID="65f280f9-caf6-429e-ac03-31bd647a05b6" Apr 16 16:47:49.929767 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:47:49.929682 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ts7hl" podUID="018fd32e-3479-4227-9d81-8a232b27fc2b" Apr 16 16:47:50.929218 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:50.929182 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-4bqrw" Apr 16 16:47:50.929723 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:47:50.929321 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-4bqrw" podUID="5ae3c2f7-2de4-4c3c-9cf8-bc6213e92f77" Apr 16 16:47:51.273340 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:51.273169 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/5ae3c2f7-2de4-4c3c-9cf8-bc6213e92f77-original-pull-secret\") pod \"global-pull-secret-syncer-4bqrw\" (UID: \"5ae3c2f7-2de4-4c3c-9cf8-bc6213e92f77\") " pod="kube-system/global-pull-secret-syncer-4bqrw" Apr 16 16:47:51.273535 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:47:51.273374 2568 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 16:47:51.273535 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:47:51.273460 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5ae3c2f7-2de4-4c3c-9cf8-bc6213e92f77-original-pull-secret podName:5ae3c2f7-2de4-4c3c-9cf8-bc6213e92f77 nodeName:}" failed. No retries permitted until 2026-04-16 16:47:55.273440212 +0000 UTC m=+11.943616973 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/5ae3c2f7-2de4-4c3c-9cf8-bc6213e92f77-original-pull-secret") pod "global-pull-secret-syncer-4bqrw" (UID: "5ae3c2f7-2de4-4c3c-9cf8-bc6213e92f77") : object "kube-system"/"original-pull-secret" not registered Apr 16 16:47:51.933608 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:51.933556 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ts7hl" Apr 16 16:47:51.934123 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:47:51.933671 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ts7hl" podUID="018fd32e-3479-4227-9d81-8a232b27fc2b" Apr 16 16:47:51.934123 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:51.933556 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-5rwjz" Apr 16 16:47:51.934243 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:47:51.934143 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5rwjz" podUID="65f280f9-caf6-429e-ac03-31bd647a05b6" Apr 16 16:47:52.929496 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:52.929460 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-4bqrw" Apr 16 16:47:52.929676 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:47:52.929611 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-4bqrw" podUID="5ae3c2f7-2de4-4c3c-9cf8-bc6213e92f77" Apr 16 16:47:53.494050 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:53.493725 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/65f280f9-caf6-429e-ac03-31bd647a05b6-metrics-certs\") pod \"network-metrics-daemon-5rwjz\" (UID: \"65f280f9-caf6-429e-ac03-31bd647a05b6\") " pod="openshift-multus/network-metrics-daemon-5rwjz" Apr 16 16:47:53.494050 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:47:53.493898 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:47:53.494050 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:47:53.493966 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/65f280f9-caf6-429e-ac03-31bd647a05b6-metrics-certs podName:65f280f9-caf6-429e-ac03-31bd647a05b6 nodeName:}" failed. No retries permitted until 2026-04-16 16:48:01.493944801 +0000 UTC m=+18.164121558 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/65f280f9-caf6-429e-ac03-31bd647a05b6-metrics-certs") pod "network-metrics-daemon-5rwjz" (UID: "65f280f9-caf6-429e-ac03-31bd647a05b6") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:47:53.594201 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:53.594159 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h2x5p\" (UniqueName: \"kubernetes.io/projected/018fd32e-3479-4227-9d81-8a232b27fc2b-kube-api-access-h2x5p\") pod \"network-check-target-ts7hl\" (UID: \"018fd32e-3479-4227-9d81-8a232b27fc2b\") " pod="openshift-network-diagnostics/network-check-target-ts7hl" Apr 16 16:47:53.594389 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:47:53.594365 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 16:47:53.594465 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:47:53.594395 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 16:47:53.594465 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:47:53.594411 2568 projected.go:194] Error preparing data for projected volume kube-api-access-h2x5p for pod openshift-network-diagnostics/network-check-target-ts7hl: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:47:53.594580 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:47:53.594478 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/018fd32e-3479-4227-9d81-8a232b27fc2b-kube-api-access-h2x5p podName:018fd32e-3479-4227-9d81-8a232b27fc2b nodeName:}" failed. No retries permitted until 2026-04-16 16:48:01.59445929 +0000 UTC m=+18.264636053 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-h2x5p" (UniqueName: "kubernetes.io/projected/018fd32e-3479-4227-9d81-8a232b27fc2b-kube-api-access-h2x5p") pod "network-check-target-ts7hl" (UID: "018fd32e-3479-4227-9d81-8a232b27fc2b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:47:53.930608 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:53.930129 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ts7hl" Apr 16 16:47:53.930608 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:47:53.930242 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ts7hl" podUID="018fd32e-3479-4227-9d81-8a232b27fc2b" Apr 16 16:47:53.930608 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:53.930300 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-5rwjz" Apr 16 16:47:53.930608 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:47:53.930408 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5rwjz" podUID="65f280f9-caf6-429e-ac03-31bd647a05b6" Apr 16 16:47:54.930164 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:54.930128 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-4bqrw" Apr 16 16:47:54.930624 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:47:54.930261 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-4bqrw" podUID="5ae3c2f7-2de4-4c3c-9cf8-bc6213e92f77" Apr 16 16:47:55.310781 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:55.310738 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/5ae3c2f7-2de4-4c3c-9cf8-bc6213e92f77-original-pull-secret\") pod \"global-pull-secret-syncer-4bqrw\" (UID: \"5ae3c2f7-2de4-4c3c-9cf8-bc6213e92f77\") " pod="kube-system/global-pull-secret-syncer-4bqrw" Apr 16 16:47:55.310961 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:47:55.310884 2568 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 16:47:55.311020 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:47:55.310965 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5ae3c2f7-2de4-4c3c-9cf8-bc6213e92f77-original-pull-secret podName:5ae3c2f7-2de4-4c3c-9cf8-bc6213e92f77 nodeName:}" failed. No retries permitted until 2026-04-16 16:48:03.310943737 +0000 UTC m=+19.981120517 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/5ae3c2f7-2de4-4c3c-9cf8-bc6213e92f77-original-pull-secret") pod "global-pull-secret-syncer-4bqrw" (UID: "5ae3c2f7-2de4-4c3c-9cf8-bc6213e92f77") : object "kube-system"/"original-pull-secret" not registered Apr 16 16:47:55.932934 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:55.932899 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ts7hl" Apr 16 16:47:55.933381 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:55.932899 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5rwjz" Apr 16 16:47:55.933381 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:47:55.933036 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-ts7hl" podUID="018fd32e-3479-4227-9d81-8a232b27fc2b" Apr 16 16:47:55.933381 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:47:55.933126 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5rwjz" podUID="65f280f9-caf6-429e-ac03-31bd647a05b6" Apr 16 16:47:56.929313 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:56.929272 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-4bqrw" Apr 16 16:47:56.929485 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:47:56.929418 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-4bqrw" podUID="5ae3c2f7-2de4-4c3c-9cf8-bc6213e92f77" Apr 16 16:47:57.929858 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:57.929739 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ts7hl" Apr 16 16:47:57.930293 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:47:57.929870 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ts7hl" podUID="018fd32e-3479-4227-9d81-8a232b27fc2b" Apr 16 16:47:57.930293 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:57.929923 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5rwjz" Apr 16 16:47:57.930293 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:47:57.930055 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5rwjz" podUID="65f280f9-caf6-429e-ac03-31bd647a05b6" Apr 16 16:47:58.929691 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:58.929656 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-4bqrw" Apr 16 16:47:58.929880 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:47:58.929782 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-4bqrw" podUID="5ae3c2f7-2de4-4c3c-9cf8-bc6213e92f77" Apr 16 16:47:59.929691 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:59.929665 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ts7hl" Apr 16 16:47:59.929856 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:47:59.929779 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ts7hl" podUID="018fd32e-3479-4227-9d81-8a232b27fc2b" Apr 16 16:47:59.929934 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:47:59.929868 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5rwjz" Apr 16 16:47:59.930252 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:47:59.930015 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5rwjz" podUID="65f280f9-caf6-429e-ac03-31bd647a05b6" Apr 16 16:48:00.929598 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:00.929556 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-4bqrw" Apr 16 16:48:00.929829 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:48:00.929686 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-4bqrw" podUID="5ae3c2f7-2de4-4c3c-9cf8-bc6213e92f77" Apr 16 16:48:01.558260 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:01.558218 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/65f280f9-caf6-429e-ac03-31bd647a05b6-metrics-certs\") pod \"network-metrics-daemon-5rwjz\" (UID: \"65f280f9-caf6-429e-ac03-31bd647a05b6\") " pod="openshift-multus/network-metrics-daemon-5rwjz" Apr 16 16:48:01.558737 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:48:01.558399 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:48:01.558737 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:48:01.558465 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/65f280f9-caf6-429e-ac03-31bd647a05b6-metrics-certs podName:65f280f9-caf6-429e-ac03-31bd647a05b6 nodeName:}" failed. No retries permitted until 2026-04-16 16:48:17.558443836 +0000 UTC m=+34.228620598 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/65f280f9-caf6-429e-ac03-31bd647a05b6-metrics-certs") pod "network-metrics-daemon-5rwjz" (UID: "65f280f9-caf6-429e-ac03-31bd647a05b6") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:48:01.659382 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:01.659349 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h2x5p\" (UniqueName: \"kubernetes.io/projected/018fd32e-3479-4227-9d81-8a232b27fc2b-kube-api-access-h2x5p\") pod \"network-check-target-ts7hl\" (UID: \"018fd32e-3479-4227-9d81-8a232b27fc2b\") " pod="openshift-network-diagnostics/network-check-target-ts7hl" Apr 16 16:48:01.659561 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:48:01.659499 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 16:48:01.659561 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:48:01.659529 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 16:48:01.659561 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:48:01.659541 2568 projected.go:194] Error preparing data for projected volume kube-api-access-h2x5p for pod openshift-network-diagnostics/network-check-target-ts7hl: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:48:01.659686 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:48:01.659598 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/018fd32e-3479-4227-9d81-8a232b27fc2b-kube-api-access-h2x5p podName:018fd32e-3479-4227-9d81-8a232b27fc2b nodeName:}" failed. No retries permitted until 2026-04-16 16:48:17.659582327 +0000 UTC m=+34.329759083 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-h2x5p" (UniqueName: "kubernetes.io/projected/018fd32e-3479-4227-9d81-8a232b27fc2b-kube-api-access-h2x5p") pod "network-check-target-ts7hl" (UID: "018fd32e-3479-4227-9d81-8a232b27fc2b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:48:01.929703 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:01.929622 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ts7hl" Apr 16 16:48:01.929703 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:01.929665 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5rwjz" Apr 16 16:48:01.929925 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:48:01.929747 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-ts7hl" podUID="018fd32e-3479-4227-9d81-8a232b27fc2b" Apr 16 16:48:01.929925 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:48:01.929895 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5rwjz" podUID="65f280f9-caf6-429e-ac03-31bd647a05b6" Apr 16 16:48:02.929657 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:02.929619 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-4bqrw" Apr 16 16:48:02.930086 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:48:02.929761 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-4bqrw" podUID="5ae3c2f7-2de4-4c3c-9cf8-bc6213e92f77" Apr 16 16:48:03.370773 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:03.370737 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/5ae3c2f7-2de4-4c3c-9cf8-bc6213e92f77-original-pull-secret\") pod \"global-pull-secret-syncer-4bqrw\" (UID: \"5ae3c2f7-2de4-4c3c-9cf8-bc6213e92f77\") " pod="kube-system/global-pull-secret-syncer-4bqrw" Apr 16 16:48:03.370934 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:48:03.370890 2568 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 16:48:03.370972 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:48:03.370959 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5ae3c2f7-2de4-4c3c-9cf8-bc6213e92f77-original-pull-secret podName:5ae3c2f7-2de4-4c3c-9cf8-bc6213e92f77 nodeName:}" failed. No retries permitted until 2026-04-16 16:48:19.370943044 +0000 UTC m=+36.041119804 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/5ae3c2f7-2de4-4c3c-9cf8-bc6213e92f77-original-pull-secret") pod "global-pull-secret-syncer-4bqrw" (UID: "5ae3c2f7-2de4-4c3c-9cf8-bc6213e92f77") : object "kube-system"/"original-pull-secret" not registered Apr 16 16:48:03.929374 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:03.929199 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ts7hl" Apr 16 16:48:03.929473 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:03.929294 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5rwjz" Apr 16 16:48:03.930636 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:48:03.930539 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-ts7hl" podUID="018fd32e-3479-4227-9d81-8a232b27fc2b" Apr 16 16:48:03.931289 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:48:03.930646 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5rwjz" podUID="65f280f9-caf6-429e-ac03-31bd647a05b6" Apr 16 16:48:04.006666 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:04.006428 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-7p8g2" event={"ID":"9227133b-0000-4b75-818e-9c2ed2ffe214","Type":"ContainerStarted","Data":"ecb8f8581d39cb4250a9bbc6cfe0785ef254fcf8ad267fc45dd4bd6a8f912d4f"} Apr 16 16:48:04.007737 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:04.007713 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-j8nrt" event={"ID":"c39d9cc0-8cac-46bd-968d-baba878cd954","Type":"ContainerStarted","Data":"3d5f249280990c3470bec4119e06dd6ee3dfcc155a84d0d19c1d82c561efa573"} Apr 16 16:48:04.009157 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:04.009134 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-nwmhb" event={"ID":"8d03bf36-9222-4d14-b16d-5f31b197f11a","Type":"ContainerStarted","Data":"8d1d5ef4b4b74d843c6a5c988a7d011053be3836570473a9de4846c75aab94ad"} Apr 16 16:48:04.010312 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:04.010262 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-brchx" event={"ID":"482c17e3-998c-48aa-b158-037aa6ebf920","Type":"ContainerStarted","Data":"e14f1be1b04d46ad7c3a6f7e7f1a9b2b7001f06db53dbb77aa759dbf72366bba"} Apr 16 16:48:04.011269 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:04.011247 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-ptmqw" event={"ID":"0c447c29-c470-4314-becc-ad24580321c8","Type":"ContainerStarted","Data":"6775a8667cd3fb55eeb3fc63103ade17522a75cf020971aec2e82b3253f3d3ad"} Apr 16 16:48:04.012336 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:04.012309 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-d4xtv" event={"ID":"49be5da9-85b8-4a24-8fec-db2c506efbbc","Type":"ContainerStarted","Data":"136530bb77d2c4df3c3eb82955d9b4f4ac224e0c891b87ad71980cc4f013af1a"} Apr 16 16:48:04.013720 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:04.013700 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s2x9b" event={"ID":"40202585-b938-4a5c-bde8-ac1c5ea40044","Type":"ContainerStarted","Data":"1fc8b9377b3759165987ea8b1d83342d7a7540f59509052624748a0ae73cbb27"} Apr 16 16:48:04.013791 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:04.013726 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s2x9b" event={"ID":"40202585-b938-4a5c-bde8-ac1c5ea40044","Type":"ContainerStarted","Data":"b1fd721dc7c934f6423f36300d6c8a92b753c898c93c8052962612b8a76f0485"} Apr 16 16:48:04.014733 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:04.014714 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xzx59" 
event={"ID":"6ee9ccfb-aa99-4334-93d4-21dad3568cda","Type":"ContainerStarted","Data":"5cec83be9b7e62117478e99b44cc718980f987ebe8752c2b258e72fd95faae1c"} Apr 16 16:48:04.023415 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:04.023368 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-7p8g2" podStartSLOduration=3.079252007 podStartE2EDuration="20.023353222s" podCreationTimestamp="2026-04-16 16:47:44 +0000 UTC" firstStartedPulling="2026-04-16 16:47:46.686140035 +0000 UTC m=+3.356316797" lastFinishedPulling="2026-04-16 16:48:03.630241248 +0000 UTC m=+20.300418012" observedRunningTime="2026-04-16 16:48:04.023320488 +0000 UTC m=+20.693497267" watchObservedRunningTime="2026-04-16 16:48:04.023353222 +0000 UTC m=+20.693530006" Apr 16 16:48:04.038411 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:04.038370 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-ptmqw" podStartSLOduration=4.081460663 podStartE2EDuration="21.038354543s" podCreationTimestamp="2026-04-16 16:47:43 +0000 UTC" firstStartedPulling="2026-04-16 16:47:46.673346392 +0000 UTC m=+3.343523149" lastFinishedPulling="2026-04-16 16:48:03.630240267 +0000 UTC m=+20.300417029" observedRunningTime="2026-04-16 16:48:04.038024668 +0000 UTC m=+20.708201447" watchObservedRunningTime="2026-04-16 16:48:04.038354543 +0000 UTC m=+20.708531322" Apr 16 16:48:04.071755 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:04.071714 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-d4xtv" podStartSLOduration=7.500650632 podStartE2EDuration="20.071699648s" podCreationTimestamp="2026-04-16 16:47:44 +0000 UTC" firstStartedPulling="2026-04-16 16:47:46.684241319 +0000 UTC m=+3.354418083" lastFinishedPulling="2026-04-16 16:47:59.255290324 +0000 UTC m=+15.925467099" observedRunningTime="2026-04-16 16:48:04.07169062 +0000 UTC m=+20.741867398" watchObservedRunningTime="2026-04-16 16:48:04.071699648 +0000 UTC m=+20.741876427" Apr 16 16:48:04.071906 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:04.071886 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-j8nrt" podStartSLOduration=8.501782727 podStartE2EDuration="21.071881512s" podCreationTimestamp="2026-04-16 16:47:43 +0000 UTC" firstStartedPulling="2026-04-16 16:47:46.68532796 +0000 UTC m=+3.355504717" lastFinishedPulling="2026-04-16 16:47:59.255426731 +0000 UTC m=+15.925603502" observedRunningTime="2026-04-16 16:48:04.052311372 +0000 UTC m=+20.722488162" watchObservedRunningTime="2026-04-16 16:48:04.071881512 +0000 UTC m=+20.742058289" Apr 16 16:48:04.112753 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:04.112702 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-nwmhb" podStartSLOduration=3.12729017 podStartE2EDuration="20.112686104s" podCreationTimestamp="2026-04-16 16:47:44 +0000 UTC" firstStartedPulling="2026-04-16 16:47:46.684792485 +0000 UTC m=+3.354969253" lastFinishedPulling="2026-04-16 16:48:03.670188411 +0000 UTC m=+20.340365187" observedRunningTime="2026-04-16 16:48:04.11215714 +0000 UTC m=+20.782333920" watchObservedRunningTime="2026-04-16 16:48:04.112686104 +0000 UTC m=+20.782862879" Apr 16 16:48:04.929578 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:04.929541 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-4bqrw" Apr 16 16:48:04.929738 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:48:04.929677 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-4bqrw" podUID="5ae3c2f7-2de4-4c3c-9cf8-bc6213e92f77" Apr 16 16:48:04.987341 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:04.987321 2568 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 16 16:48:05.017712 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:05.017685 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-j6gfs" event={"ID":"1f2cee13-cf01-40a3-993d-f9a41ddeae81","Type":"ContainerStarted","Data":"dadc7bd3db1c52faad7e2d5f479e8720614847f1d6dc8d328e6bba93426ba346"} Apr 16 16:48:05.018889 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:05.018858 2568 generic.go:358] "Generic (PLEG): container finished" podID="482c17e3-998c-48aa-b158-037aa6ebf920" containerID="e14f1be1b04d46ad7c3a6f7e7f1a9b2b7001f06db53dbb77aa759dbf72366bba" exitCode=0 Apr 16 16:48:05.018971 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:05.018937 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-brchx" event={"ID":"482c17e3-998c-48aa-b158-037aa6ebf920","Type":"ContainerDied","Data":"e14f1be1b04d46ad7c3a6f7e7f1a9b2b7001f06db53dbb77aa759dbf72366bba"} Apr 16 16:48:05.021800 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:05.021780 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s2x9b" event={"ID":"40202585-b938-4a5c-bde8-ac1c5ea40044","Type":"ContainerStarted","Data":"f5606d99386411f16e1a5b1b2ca85bf8bd6401d47b250c7857d1a4c46130217a"} Apr 16 16:48:05.021873 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:05.021804 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s2x9b" event={"ID":"40202585-b938-4a5c-bde8-ac1c5ea40044","Type":"ContainerStarted","Data":"900a22610f58e1ae76ecb1283d3a02e71c181942b5538d81d9aba0be0d897f16"} Apr 16 16:48:05.021873 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:05.021814 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s2x9b" event={"ID":"40202585-b938-4a5c-bde8-ac1c5ea40044","Type":"ContainerStarted","Data":"b28ef2d5c0f241b4c2bf54cdbd376c9397decb0b57dfc9046392b271c4c32794"} Apr 16 16:48:05.021873 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:05.021825 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s2x9b" event={"ID":"40202585-b938-4a5c-bde8-ac1c5ea40044","Type":"ContainerStarted","Data":"651838fa4ef832e8abcb882b91f637096367662a8cfc361396a5ff5f6df7d349"} Apr 16 16:48:05.023313 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:05.023282 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xzx59" event={"ID":"6ee9ccfb-aa99-4334-93d4-21dad3568cda","Type":"ContainerStarted","Data":"5ba46c49edec318d7fbfb34c131db11e5d330eac6a230ce7cb3a162ce03a13fc"} Apr 16 16:48:05.031815 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:05.031773 2568 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-j6gfs" podStartSLOduration=4.133244653 podStartE2EDuration="21.031763871s" podCreationTimestamp="2026-04-16 16:47:44 +0000 UTC" firstStartedPulling="2026-04-16 16:47:46.682698023 +0000 UTC m=+3.352874788" lastFinishedPulling="2026-04-16 16:48:03.581217242 +0000 UTC m=+20.251394006" observedRunningTime="2026-04-16 16:48:05.031548498 +0000 UTC m=+21.701725274" watchObservedRunningTime="2026-04-16 16:48:05.031763871 +0000 UTC m=+21.701940650" Apr 16 16:48:05.875467 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:05.875168 2568 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T16:48:04.987336785Z","UUID":"ffc62b7b-8f08-4e92-95b3-d1740d6abd55","Handler":null,"Name":"","Endpoint":""} Apr 16 16:48:05.878456 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:05.878420 2568 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 16 16:48:05.878456 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:05.878461 2568 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 16 16:48:05.929395 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:05.929358 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ts7hl" Apr 16 16:48:05.929550 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:48:05.929501 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ts7hl" podUID="018fd32e-3479-4227-9d81-8a232b27fc2b" Apr 16 16:48:05.929620 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:05.929562 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5rwjz" Apr 16 16:48:05.929739 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:48:05.929669 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5rwjz" podUID="65f280f9-caf6-429e-ac03-31bd647a05b6" Apr 16 16:48:06.929472 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:06.929434 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-4bqrw" Apr 16 16:48:06.929987 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:48:06.929594 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
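
The plugin_watcher line at 16:48:04.987, the RegisterPlugin line, and the two csi_plugin lines together are the whole node-side CSI registration handshake for ebs.csi.aws.com: a registration socket appears under /var/lib/kubelet/plugins_registry, the watcher picks it up, and the kubelet dials it to learn the driver's name and endpoint before registering it. The discovery step is plain filesystem watching; a minimal sketch of that step only (the gRPC half of the handshake is omitted, and details differ from the kubelet's pluginwatcher):

```go
package main

import (
	"fmt"
	"log"
	"strings"

	"github.com/fsnotify/fsnotify"
)

func main() {
	w, err := fsnotify.NewWatcher()
	if err != nil {
		log.Fatal(err)
	}
	defer w.Close()

	// Same directory the log shows for ebs.csi.aws.com-reg.sock.
	if err := w.Add("/var/lib/kubelet/plugins_registry"); err != nil {
		log.Fatal(err)
	}
	for ev := range w.Events {
		// A new *-reg.sock is the cue to dial it and ask the plugin for
		// its name and endpoint (that gRPC exchange is omitted here).
		if ev.Op&fsnotify.Create != 0 && strings.HasSuffix(ev.Name, "-reg.sock") {
			fmt.Println("found plugin registration socket:", ev.Name)
		}
	}
}
```
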
pod="kube-system/global-pull-secret-syncer-4bqrw" podUID="5ae3c2f7-2de4-4c3c-9cf8-bc6213e92f77" Apr 16 16:48:07.030154 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:07.030117 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s2x9b" event={"ID":"40202585-b938-4a5c-bde8-ac1c5ea40044","Type":"ContainerStarted","Data":"7465f05f85f3f8357c3fe10f16230d96ac807e44cad5e6ff6dbe08b825de3d02"} Apr 16 16:48:07.032104 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:07.032072 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xzx59" event={"ID":"6ee9ccfb-aa99-4334-93d4-21dad3568cda","Type":"ContainerStarted","Data":"6e06f863f624d54dcbd64ae6a63c1d70b6a6ad0496e4fb22e4081721f83d20ab"} Apr 16 16:48:07.051534 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:07.051480 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xzx59" podStartSLOduration=3.819849422 podStartE2EDuration="23.051463578s" podCreationTimestamp="2026-04-16 16:47:44 +0000 UTC" firstStartedPulling="2026-04-16 16:47:46.68115957 +0000 UTC m=+3.351336343" lastFinishedPulling="2026-04-16 16:48:05.912773727 +0000 UTC m=+22.582950499" observedRunningTime="2026-04-16 16:48:07.050852792 +0000 UTC m=+23.721029573" watchObservedRunningTime="2026-04-16 16:48:07.051463578 +0000 UTC m=+23.721640362" Apr 16 16:48:07.930199 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:07.930162 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ts7hl" Apr 16 16:48:07.930752 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:07.930217 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5rwjz" Apr 16 16:48:07.930752 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:48:07.930293 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ts7hl" podUID="018fd32e-3479-4227-9d81-8a232b27fc2b" Apr 16 16:48:07.930752 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:48:07.930458 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5rwjz" podUID="65f280f9-caf6-429e-ac03-31bd647a05b6" Apr 16 16:48:07.997717 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:07.997680 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-d4xtv" Apr 16 16:48:07.998645 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:07.998622 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-d4xtv" Apr 16 16:48:08.929194 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:08.929067 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-4bqrw" Apr 16 16:48:08.929301 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:48:08.929275 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-4bqrw" podUID="5ae3c2f7-2de4-4c3c-9cf8-bc6213e92f77" Apr 16 16:48:09.039843 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:09.039743 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s2x9b" event={"ID":"40202585-b938-4a5c-bde8-ac1c5ea40044","Type":"ContainerStarted","Data":"ef1e56563dde825a64f27127c24f50030f49a8984a8274ad84879195982fa4d6"} Apr 16 16:48:09.040946 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:09.040140 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-s2x9b" Apr 16 16:48:09.040946 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:09.040174 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-s2x9b" Apr 16 16:48:09.053610 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:09.053580 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-s2x9b" Apr 16 16:48:09.073964 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:09.073917 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-s2x9b" podStartSLOduration=8.981070296 podStartE2EDuration="26.073902979s" podCreationTimestamp="2026-04-16 16:47:43 +0000 UTC" firstStartedPulling="2026-04-16 16:47:46.677609977 +0000 UTC m=+3.347786739" lastFinishedPulling="2026-04-16 16:48:03.770442658 +0000 UTC m=+20.440619422" observedRunningTime="2026-04-16 16:48:09.073281112 +0000 UTC m=+25.743457892" watchObservedRunningTime="2026-04-16 16:48:09.073902979 +0000 UTC m=+25.744079968" Apr 16 16:48:09.929720 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:09.929682 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5rwjz" Apr 16 16:48:09.929916 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:09.929679 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ts7hl" Apr 16 16:48:09.929916 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:48:09.929795 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5rwjz" podUID="65f280f9-caf6-429e-ac03-31bd647a05b6" Apr 16 16:48:09.929916 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:48:09.929870 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-ts7hl" podUID="018fd32e-3479-4227-9d81-8a232b27fc2b" Apr 16 16:48:10.043281 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:10.043249 2568 generic.go:358] "Generic (PLEG): container finished" podID="482c17e3-998c-48aa-b158-037aa6ebf920" containerID="e567f3fce58101f04d10f4c1b338d748357fcbabd5cb125a391016d6874da2d5" exitCode=0 Apr 16 16:48:10.043754 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:10.043342 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-brchx" event={"ID":"482c17e3-998c-48aa-b158-037aa6ebf920","Type":"ContainerDied","Data":"e567f3fce58101f04d10f4c1b338d748357fcbabd5cb125a391016d6874da2d5"} Apr 16 16:48:10.043884 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:10.043852 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-s2x9b" Apr 16 16:48:10.058063 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:10.058016 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-s2x9b" Apr 16 16:48:10.891034 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:10.890869 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-5rwjz"] Apr 16 16:48:10.891211 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:10.891113 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5rwjz" Apr 16 16:48:10.891259 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:48:10.891209 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5rwjz" podUID="65f280f9-caf6-429e-ac03-31bd647a05b6" Apr 16 16:48:10.894785 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:10.894745 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-ts7hl"] Apr 16 16:48:10.894927 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:10.894850 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ts7hl" Apr 16 16:48:10.894973 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:48:10.894940 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ts7hl" podUID="018fd32e-3479-4227-9d81-8a232b27fc2b" Apr 16 16:48:10.895505 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:10.895475 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-4bqrw"] Apr 16 16:48:10.895622 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:10.895608 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-4bqrw" Apr 16 16:48:10.895720 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:48:10.895695 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-4bqrw" podUID="5ae3c2f7-2de4-4c3c-9cf8-bc6213e92f77" Apr 16 16:48:12.048431 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:12.048397 2568 generic.go:358] "Generic (PLEG): container finished" podID="482c17e3-998c-48aa-b158-037aa6ebf920" containerID="71eb36bbeee9e76477b9dbff044f681c4fcb5e5e1f4d46cdc658c574dd48af43" exitCode=0 Apr 16 16:48:12.048903 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:12.048492 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-brchx" event={"ID":"482c17e3-998c-48aa-b158-037aa6ebf920","Type":"ContainerDied","Data":"71eb36bbeee9e76477b9dbff044f681c4fcb5e5e1f4d46cdc658c574dd48af43"} Apr 16 16:48:12.049069 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:12.049045 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-d4xtv" Apr 16 16:48:12.049200 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:12.049144 2568 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 16 16:48:12.049662 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:12.049646 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-d4xtv" Apr 16 16:48:12.929941 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:12.929911 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ts7hl" Apr 16 16:48:12.930073 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:12.930011 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-4bqrw" Apr 16 16:48:12.930073 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:12.930033 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5rwjz" Apr 16 16:48:12.930187 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:48:12.930009 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ts7hl" podUID="018fd32e-3479-4227-9d81-8a232b27fc2b" Apr 16 16:48:12.930187 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:48:12.930092 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-4bqrw" podUID="5ae3c2f7-2de4-4c3c-9cf8-bc6213e92f77" Apr 16 16:48:12.930187 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:48:12.930176 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5rwjz" podUID="65f280f9-caf6-429e-ac03-31bd647a05b6" Apr 16 16:48:13.052619 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:13.052541 2568 generic.go:358] "Generic (PLEG): container finished" podID="482c17e3-998c-48aa-b158-037aa6ebf920" containerID="b7eb945cc2bbe769443719f57277d1b5a41e58f02507b83de33406dd6dcb37da" exitCode=0 Apr 16 16:48:13.052619 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:13.052608 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-brchx" event={"ID":"482c17e3-998c-48aa-b158-037aa6ebf920","Type":"ContainerDied","Data":"b7eb945cc2bbe769443719f57277d1b5a41e58f02507b83de33406dd6dcb37da"} Apr 16 16:48:14.929354 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:14.929268 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5rwjz" Apr 16 16:48:14.929876 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:14.929268 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-4bqrw" Apr 16 16:48:14.929876 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:48:14.929401 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5rwjz" podUID="65f280f9-caf6-429e-ac03-31bd647a05b6" Apr 16 16:48:14.929876 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:14.929268 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ts7hl" Apr 16 16:48:14.929876 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:48:14.929490 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-4bqrw" podUID="5ae3c2f7-2de4-4c3c-9cf8-bc6213e92f77" Apr 16 16:48:14.929876 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:48:14.929565 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-ts7hl" podUID="018fd32e-3479-4227-9d81-8a232b27fc2b" Apr 16 16:48:16.591686 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:16.591599 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-63.ec2.internal" event="NodeReady" Apr 16 16:48:16.592119 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:16.591750 2568 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 16 16:48:16.625751 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:16.625709 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66f6866d4-g7mhz"] Apr 16 16:48:16.657436 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:16.657396 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-ndbsz"] Apr 16 16:48:16.657600 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:16.657572 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66f6866d4-g7mhz" Apr 16 16:48:16.660095 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:16.660072 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 16 16:48:16.660274 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:16.660102 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-zkp2p\"" Apr 16 16:48:16.660384 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:16.660368 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 16 16:48:16.660443 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:16.660409 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 16 16:48:16.672818 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:16.672794 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-4nrrn"] Apr 16 16:48:16.672972 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:16.672956 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-ndbsz" Apr 16 16:48:16.675784 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:16.675766 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 16 16:48:16.676083 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:16.676054 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 16 16:48:16.676083 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:16.676068 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-z7whv\"" Apr 16 16:48:16.685197 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:16.685177 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 16 16:48:16.697619 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:16.697589 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66f6866d4-g7mhz"] Apr 16 16:48:16.697619 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:16.697613 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-4nrrn"] Apr 16 16:48:16.697619 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:16.697622 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-ndbsz"] Apr 16 16:48:16.697847 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:16.697750 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-4nrrn" Apr 16 16:48:16.700716 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:16.700695 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-s9vpx\"" Apr 16 16:48:16.700716 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:16.700707 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 16 16:48:16.700876 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:16.700696 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 16 16:48:16.700876 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:16.700730 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 16 16:48:16.779688 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:16.779658 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a06f9194-8eb8-465c-a248-cdeff2ea3ec9-ca-trust-extracted\") pod \"image-registry-66f6866d4-g7mhz\" (UID: \"a06f9194-8eb8-465c-a248-cdeff2ea3ec9\") " pod="openshift-image-registry/image-registry-66f6866d4-g7mhz" Apr 16 16:48:16.779877 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:16.779700 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a06f9194-8eb8-465c-a248-cdeff2ea3ec9-registry-tls\") pod \"image-registry-66f6866d4-g7mhz\" (UID: \"a06f9194-8eb8-465c-a248-cdeff2ea3ec9\") " pod="openshift-image-registry/image-registry-66f6866d4-g7mhz" Apr 16 16:48:16.779877 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:16.779739 2568 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a06f9194-8eb8-465c-a248-cdeff2ea3ec9-bound-sa-token\") pod \"image-registry-66f6866d4-g7mhz\" (UID: \"a06f9194-8eb8-465c-a248-cdeff2ea3ec9\") " pod="openshift-image-registry/image-registry-66f6866d4-g7mhz" Apr 16 16:48:16.779877 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:16.779838 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/a06f9194-8eb8-465c-a248-cdeff2ea3ec9-image-registry-private-configuration\") pod \"image-registry-66f6866d4-g7mhz\" (UID: \"a06f9194-8eb8-465c-a248-cdeff2ea3ec9\") " pod="openshift-image-registry/image-registry-66f6866d4-g7mhz" Apr 16 16:48:16.780018 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:16.779882 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a06f9194-8eb8-465c-a248-cdeff2ea3ec9-registry-certificates\") pod \"image-registry-66f6866d4-g7mhz\" (UID: \"a06f9194-8eb8-465c-a248-cdeff2ea3ec9\") " pod="openshift-image-registry/image-registry-66f6866d4-g7mhz" Apr 16 16:48:16.780018 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:16.779954 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a06f9194-8eb8-465c-a248-cdeff2ea3ec9-installation-pull-secrets\") pod \"image-registry-66f6866d4-g7mhz\" (UID: \"a06f9194-8eb8-465c-a248-cdeff2ea3ec9\") " pod="openshift-image-registry/image-registry-66f6866d4-g7mhz" Apr 16 16:48:16.780101 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:16.780025 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a06f9194-8eb8-465c-a248-cdeff2ea3ec9-trusted-ca\") pod \"image-registry-66f6866d4-g7mhz\" (UID: \"a06f9194-8eb8-465c-a248-cdeff2ea3ec9\") " pod="openshift-image-registry/image-registry-66f6866d4-g7mhz" Apr 16 16:48:16.780101 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:16.780055 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n6t6q\" (UniqueName: \"kubernetes.io/projected/a06f9194-8eb8-465c-a248-cdeff2ea3ec9-kube-api-access-n6t6q\") pod \"image-registry-66f6866d4-g7mhz\" (UID: \"a06f9194-8eb8-465c-a248-cdeff2ea3ec9\") " pod="openshift-image-registry/image-registry-66f6866d4-g7mhz" Apr 16 16:48:16.780185 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:16.780112 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e9c730fc-1a8b-4c50-92f3-2ebfd693c270-config-volume\") pod \"dns-default-ndbsz\" (UID: \"e9c730fc-1a8b-4c50-92f3-2ebfd693c270\") " pod="openshift-dns/dns-default-ndbsz" Apr 16 16:48:16.780185 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:16.780145 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e9c730fc-1a8b-4c50-92f3-2ebfd693c270-metrics-tls\") pod \"dns-default-ndbsz\" (UID: \"e9c730fc-1a8b-4c50-92f3-2ebfd693c270\") " pod="openshift-dns/dns-default-ndbsz" Apr 16 16:48:16.780185 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:16.780174 2568 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e9c730fc-1a8b-4c50-92f3-2ebfd693c270-tmp-dir\") pod \"dns-default-ndbsz\" (UID: \"e9c730fc-1a8b-4c50-92f3-2ebfd693c270\") " pod="openshift-dns/dns-default-ndbsz" Apr 16 16:48:16.780304 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:16.780215 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkm79\" (UniqueName: \"kubernetes.io/projected/e9c730fc-1a8b-4c50-92f3-2ebfd693c270-kube-api-access-jkm79\") pod \"dns-default-ndbsz\" (UID: \"e9c730fc-1a8b-4c50-92f3-2ebfd693c270\") " pod="openshift-dns/dns-default-ndbsz" Apr 16 16:48:16.880758 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:16.880673 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a06f9194-8eb8-465c-a248-cdeff2ea3ec9-bound-sa-token\") pod \"image-registry-66f6866d4-g7mhz\" (UID: \"a06f9194-8eb8-465c-a248-cdeff2ea3ec9\") " pod="openshift-image-registry/image-registry-66f6866d4-g7mhz" Apr 16 16:48:16.880758 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:16.880734 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/a06f9194-8eb8-465c-a248-cdeff2ea3ec9-image-registry-private-configuration\") pod \"image-registry-66f6866d4-g7mhz\" (UID: \"a06f9194-8eb8-465c-a248-cdeff2ea3ec9\") " pod="openshift-image-registry/image-registry-66f6866d4-g7mhz" Apr 16 16:48:16.880957 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:16.880787 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kcmpm\" (UniqueName: \"kubernetes.io/projected/af4910d1-f39b-44e2-805e-bcc17c0e30d0-kube-api-access-kcmpm\") pod \"ingress-canary-4nrrn\" (UID: \"af4910d1-f39b-44e2-805e-bcc17c0e30d0\") " pod="openshift-ingress-canary/ingress-canary-4nrrn" Apr 16 16:48:16.880957 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:16.880914 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a06f9194-8eb8-465c-a248-cdeff2ea3ec9-registry-certificates\") pod \"image-registry-66f6866d4-g7mhz\" (UID: \"a06f9194-8eb8-465c-a248-cdeff2ea3ec9\") " pod="openshift-image-registry/image-registry-66f6866d4-g7mhz" Apr 16 16:48:16.881044 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:16.880954 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a06f9194-8eb8-465c-a248-cdeff2ea3ec9-installation-pull-secrets\") pod \"image-registry-66f6866d4-g7mhz\" (UID: \"a06f9194-8eb8-465c-a248-cdeff2ea3ec9\") " pod="openshift-image-registry/image-registry-66f6866d4-g7mhz" Apr 16 16:48:16.881126 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:16.881091 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a06f9194-8eb8-465c-a248-cdeff2ea3ec9-trusted-ca\") pod \"image-registry-66f6866d4-g7mhz\" (UID: \"a06f9194-8eb8-465c-a248-cdeff2ea3ec9\") " pod="openshift-image-registry/image-registry-66f6866d4-g7mhz" Apr 16 16:48:16.881203 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:16.881139 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n6t6q\" (UniqueName: 
\"kubernetes.io/projected/a06f9194-8eb8-465c-a248-cdeff2ea3ec9-kube-api-access-n6t6q\") pod \"image-registry-66f6866d4-g7mhz\" (UID: \"a06f9194-8eb8-465c-a248-cdeff2ea3ec9\") " pod="openshift-image-registry/image-registry-66f6866d4-g7mhz" Apr 16 16:48:16.881203 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:16.881170 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e9c730fc-1a8b-4c50-92f3-2ebfd693c270-config-volume\") pod \"dns-default-ndbsz\" (UID: \"e9c730fc-1a8b-4c50-92f3-2ebfd693c270\") " pod="openshift-dns/dns-default-ndbsz" Apr 16 16:48:16.881203 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:16.881191 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e9c730fc-1a8b-4c50-92f3-2ebfd693c270-metrics-tls\") pod \"dns-default-ndbsz\" (UID: \"e9c730fc-1a8b-4c50-92f3-2ebfd693c270\") " pod="openshift-dns/dns-default-ndbsz" Apr 16 16:48:16.881342 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:16.881212 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e9c730fc-1a8b-4c50-92f3-2ebfd693c270-tmp-dir\") pod \"dns-default-ndbsz\" (UID: \"e9c730fc-1a8b-4c50-92f3-2ebfd693c270\") " pod="openshift-dns/dns-default-ndbsz" Apr 16 16:48:16.881342 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:16.881239 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/af4910d1-f39b-44e2-805e-bcc17c0e30d0-cert\") pod \"ingress-canary-4nrrn\" (UID: \"af4910d1-f39b-44e2-805e-bcc17c0e30d0\") " pod="openshift-ingress-canary/ingress-canary-4nrrn" Apr 16 16:48:16.881342 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:16.881264 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jkm79\" (UniqueName: \"kubernetes.io/projected/e9c730fc-1a8b-4c50-92f3-2ebfd693c270-kube-api-access-jkm79\") pod \"dns-default-ndbsz\" (UID: \"e9c730fc-1a8b-4c50-92f3-2ebfd693c270\") " pod="openshift-dns/dns-default-ndbsz" Apr 16 16:48:16.881342 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:16.881314 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a06f9194-8eb8-465c-a248-cdeff2ea3ec9-ca-trust-extracted\") pod \"image-registry-66f6866d4-g7mhz\" (UID: \"a06f9194-8eb8-465c-a248-cdeff2ea3ec9\") " pod="openshift-image-registry/image-registry-66f6866d4-g7mhz" Apr 16 16:48:16.881551 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:16.881343 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a06f9194-8eb8-465c-a248-cdeff2ea3ec9-registry-tls\") pod \"image-registry-66f6866d4-g7mhz\" (UID: \"a06f9194-8eb8-465c-a248-cdeff2ea3ec9\") " pod="openshift-image-registry/image-registry-66f6866d4-g7mhz" Apr 16 16:48:16.881551 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:48:16.881432 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 16:48:16.881551 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:48:16.881464 2568 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 16:48:16.881551 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:48:16.881479 2568 
projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-66f6866d4-g7mhz: secret "image-registry-tls" not found Apr 16 16:48:16.881551 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:48:16.881533 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e9c730fc-1a8b-4c50-92f3-2ebfd693c270-metrics-tls podName:e9c730fc-1a8b-4c50-92f3-2ebfd693c270 nodeName:}" failed. No retries permitted until 2026-04-16 16:48:17.381496696 +0000 UTC m=+34.051673469 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e9c730fc-1a8b-4c50-92f3-2ebfd693c270-metrics-tls") pod "dns-default-ndbsz" (UID: "e9c730fc-1a8b-4c50-92f3-2ebfd693c270") : secret "dns-default-metrics-tls" not found Apr 16 16:48:16.881783 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:48:16.881556 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a06f9194-8eb8-465c-a248-cdeff2ea3ec9-registry-tls podName:a06f9194-8eb8-465c-a248-cdeff2ea3ec9 nodeName:}" failed. No retries permitted until 2026-04-16 16:48:17.381544341 +0000 UTC m=+34.051721106 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/a06f9194-8eb8-465c-a248-cdeff2ea3ec9-registry-tls") pod "image-registry-66f6866d4-g7mhz" (UID: "a06f9194-8eb8-465c-a248-cdeff2ea3ec9") : secret "image-registry-tls" not found Apr 16 16:48:16.881783 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:16.881580 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a06f9194-8eb8-465c-a248-cdeff2ea3ec9-registry-certificates\") pod \"image-registry-66f6866d4-g7mhz\" (UID: \"a06f9194-8eb8-465c-a248-cdeff2ea3ec9\") " pod="openshift-image-registry/image-registry-66f6866d4-g7mhz" Apr 16 16:48:16.881783 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:16.881766 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e9c730fc-1a8b-4c50-92f3-2ebfd693c270-tmp-dir\") pod \"dns-default-ndbsz\" (UID: \"e9c730fc-1a8b-4c50-92f3-2ebfd693c270\") " pod="openshift-dns/dns-default-ndbsz" Apr 16 16:48:16.881931 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:16.881846 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a06f9194-8eb8-465c-a248-cdeff2ea3ec9-ca-trust-extracted\") pod \"image-registry-66f6866d4-g7mhz\" (UID: \"a06f9194-8eb8-465c-a248-cdeff2ea3ec9\") " pod="openshift-image-registry/image-registry-66f6866d4-g7mhz" Apr 16 16:48:16.881989 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:16.881952 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e9c730fc-1a8b-4c50-92f3-2ebfd693c270-config-volume\") pod \"dns-default-ndbsz\" (UID: \"e9c730fc-1a8b-4c50-92f3-2ebfd693c270\") " pod="openshift-dns/dns-default-ndbsz" Apr 16 16:48:16.884974 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:16.884857 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a06f9194-8eb8-465c-a248-cdeff2ea3ec9-installation-pull-secrets\") pod \"image-registry-66f6866d4-g7mhz\" (UID: \"a06f9194-8eb8-465c-a248-cdeff2ea3ec9\") " pod="openshift-image-registry/image-registry-66f6866d4-g7mhz" Apr 16 
16:48:16.885073 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:16.884876 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/a06f9194-8eb8-465c-a248-cdeff2ea3ec9-image-registry-private-configuration\") pod \"image-registry-66f6866d4-g7mhz\" (UID: \"a06f9194-8eb8-465c-a248-cdeff2ea3ec9\") " pod="openshift-image-registry/image-registry-66f6866d4-g7mhz" Apr 16 16:48:16.890258 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:16.890236 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a06f9194-8eb8-465c-a248-cdeff2ea3ec9-bound-sa-token\") pod \"image-registry-66f6866d4-g7mhz\" (UID: \"a06f9194-8eb8-465c-a248-cdeff2ea3ec9\") " pod="openshift-image-registry/image-registry-66f6866d4-g7mhz" Apr 16 16:48:16.890425 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:16.890399 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n6t6q\" (UniqueName: \"kubernetes.io/projected/a06f9194-8eb8-465c-a248-cdeff2ea3ec9-kube-api-access-n6t6q\") pod \"image-registry-66f6866d4-g7mhz\" (UID: \"a06f9194-8eb8-465c-a248-cdeff2ea3ec9\") " pod="openshift-image-registry/image-registry-66f6866d4-g7mhz" Apr 16 16:48:16.890539 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:16.890507 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jkm79\" (UniqueName: \"kubernetes.io/projected/e9c730fc-1a8b-4c50-92f3-2ebfd693c270-kube-api-access-jkm79\") pod \"dns-default-ndbsz\" (UID: \"e9c730fc-1a8b-4c50-92f3-2ebfd693c270\") " pod="openshift-dns/dns-default-ndbsz" Apr 16 16:48:16.918504 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:16.918470 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a06f9194-8eb8-465c-a248-cdeff2ea3ec9-trusted-ca\") pod \"image-registry-66f6866d4-g7mhz\" (UID: \"a06f9194-8eb8-465c-a248-cdeff2ea3ec9\") " pod="openshift-image-registry/image-registry-66f6866d4-g7mhz" Apr 16 16:48:16.929394 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:16.929369 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5rwjz" Apr 16 16:48:16.929560 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:16.929367 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-4bqrw" Apr 16 16:48:16.929560 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:16.929367 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ts7hl" Apr 16 16:48:16.932438 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:16.932398 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-km8f8\"" Apr 16 16:48:16.932438 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:16.932414 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 16:48:16.932634 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:16.932413 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 16:48:16.932634 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:16.932414 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-fz45n\"" Apr 16 16:48:16.932634 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:16.932398 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 16:48:16.932634 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:16.932414 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 16 16:48:16.982219 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:16.982188 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kcmpm\" (UniqueName: \"kubernetes.io/projected/af4910d1-f39b-44e2-805e-bcc17c0e30d0-kube-api-access-kcmpm\") pod \"ingress-canary-4nrrn\" (UID: \"af4910d1-f39b-44e2-805e-bcc17c0e30d0\") " pod="openshift-ingress-canary/ingress-canary-4nrrn" Apr 16 16:48:16.982393 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:16.982263 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/af4910d1-f39b-44e2-805e-bcc17c0e30d0-cert\") pod \"ingress-canary-4nrrn\" (UID: \"af4910d1-f39b-44e2-805e-bcc17c0e30d0\") " pod="openshift-ingress-canary/ingress-canary-4nrrn" Apr 16 16:48:16.982393 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:48:16.982369 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 16:48:16.982477 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:48:16.982430 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/af4910d1-f39b-44e2-805e-bcc17c0e30d0-cert podName:af4910d1-f39b-44e2-805e-bcc17c0e30d0 nodeName:}" failed. No retries permitted until 2026-04-16 16:48:17.482413021 +0000 UTC m=+34.152589797 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/af4910d1-f39b-44e2-805e-bcc17c0e30d0-cert") pod "ingress-canary-4nrrn" (UID: "af4910d1-f39b-44e2-805e-bcc17c0e30d0") : secret "canary-serving-cert" not found Apr 16 16:48:16.991149 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:16.991121 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kcmpm\" (UniqueName: \"kubernetes.io/projected/af4910d1-f39b-44e2-805e-bcc17c0e30d0-kube-api-access-kcmpm\") pod \"ingress-canary-4nrrn\" (UID: \"af4910d1-f39b-44e2-805e-bcc17c0e30d0\") " pod="openshift-ingress-canary/ingress-canary-4nrrn" Apr 16 16:48:17.386120 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:17.386085 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a06f9194-8eb8-465c-a248-cdeff2ea3ec9-registry-tls\") pod \"image-registry-66f6866d4-g7mhz\" (UID: \"a06f9194-8eb8-465c-a248-cdeff2ea3ec9\") " pod="openshift-image-registry/image-registry-66f6866d4-g7mhz" Apr 16 16:48:17.386315 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:17.386191 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e9c730fc-1a8b-4c50-92f3-2ebfd693c270-metrics-tls\") pod \"dns-default-ndbsz\" (UID: \"e9c730fc-1a8b-4c50-92f3-2ebfd693c270\") " pod="openshift-dns/dns-default-ndbsz" Apr 16 16:48:17.386315 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:48:17.386262 2568 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 16:48:17.386315 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:48:17.386286 2568 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-66f6866d4-g7mhz: secret "image-registry-tls" not found Apr 16 16:48:17.386438 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:48:17.386318 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 16:48:17.386438 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:48:17.386351 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a06f9194-8eb8-465c-a248-cdeff2ea3ec9-registry-tls podName:a06f9194-8eb8-465c-a248-cdeff2ea3ec9 nodeName:}" failed. No retries permitted until 2026-04-16 16:48:18.386335237 +0000 UTC m=+35.056511994 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/a06f9194-8eb8-465c-a248-cdeff2ea3ec9-registry-tls") pod "image-registry-66f6866d4-g7mhz" (UID: "a06f9194-8eb8-465c-a248-cdeff2ea3ec9") : secret "image-registry-tls" not found Apr 16 16:48:17.386438 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:48:17.386370 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e9c730fc-1a8b-4c50-92f3-2ebfd693c270-metrics-tls podName:e9c730fc-1a8b-4c50-92f3-2ebfd693c270 nodeName:}" failed. No retries permitted until 2026-04-16 16:48:18.386357812 +0000 UTC m=+35.056534569 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e9c730fc-1a8b-4c50-92f3-2ebfd693c270-metrics-tls") pod "dns-default-ndbsz" (UID: "e9c730fc-1a8b-4c50-92f3-2ebfd693c270") : secret "dns-default-metrics-tls" not found Apr 16 16:48:17.487588 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:17.487549 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/af4910d1-f39b-44e2-805e-bcc17c0e30d0-cert\") pod \"ingress-canary-4nrrn\" (UID: \"af4910d1-f39b-44e2-805e-bcc17c0e30d0\") " pod="openshift-ingress-canary/ingress-canary-4nrrn" Apr 16 16:48:17.487753 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:48:17.487714 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 16:48:17.487826 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:48:17.487801 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/af4910d1-f39b-44e2-805e-bcc17c0e30d0-cert podName:af4910d1-f39b-44e2-805e-bcc17c0e30d0 nodeName:}" failed. No retries permitted until 2026-04-16 16:48:18.48777793 +0000 UTC m=+35.157954689 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/af4910d1-f39b-44e2-805e-bcc17c0e30d0-cert") pod "ingress-canary-4nrrn" (UID: "af4910d1-f39b-44e2-805e-bcc17c0e30d0") : secret "canary-serving-cert" not found Apr 16 16:48:17.588801 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:17.588763 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/65f280f9-caf6-429e-ac03-31bd647a05b6-metrics-certs\") pod \"network-metrics-daemon-5rwjz\" (UID: \"65f280f9-caf6-429e-ac03-31bd647a05b6\") " pod="openshift-multus/network-metrics-daemon-5rwjz" Apr 16 16:48:17.588950 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:48:17.588890 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 16:48:17.588950 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:48:17.588949 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/65f280f9-caf6-429e-ac03-31bd647a05b6-metrics-certs podName:65f280f9-caf6-429e-ac03-31bd647a05b6 nodeName:}" failed. No retries permitted until 2026-04-16 16:48:49.588935318 +0000 UTC m=+66.259112080 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/65f280f9-caf6-429e-ac03-31bd647a05b6-metrics-certs") pod "network-metrics-daemon-5rwjz" (UID: "65f280f9-caf6-429e-ac03-31bd647a05b6") : secret "metrics-daemon-secret" not found Apr 16 16:48:17.689766 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:17.689673 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h2x5p\" (UniqueName: \"kubernetes.io/projected/018fd32e-3479-4227-9d81-8a232b27fc2b-kube-api-access-h2x5p\") pod \"network-check-target-ts7hl\" (UID: \"018fd32e-3479-4227-9d81-8a232b27fc2b\") " pod="openshift-network-diagnostics/network-check-target-ts7hl" Apr 16 16:48:17.692916 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:17.692887 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2x5p\" (UniqueName: \"kubernetes.io/projected/018fd32e-3479-4227-9d81-8a232b27fc2b-kube-api-access-h2x5p\") pod \"network-check-target-ts7hl\" (UID: \"018fd32e-3479-4227-9d81-8a232b27fc2b\") " pod="openshift-network-diagnostics/network-check-target-ts7hl" Apr 16 16:48:17.854536 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:17.854487 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ts7hl" Apr 16 16:48:18.395713 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:18.395668 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e9c730fc-1a8b-4c50-92f3-2ebfd693c270-metrics-tls\") pod \"dns-default-ndbsz\" (UID: \"e9c730fc-1a8b-4c50-92f3-2ebfd693c270\") " pod="openshift-dns/dns-default-ndbsz" Apr 16 16:48:18.395910 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:18.395762 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a06f9194-8eb8-465c-a248-cdeff2ea3ec9-registry-tls\") pod \"image-registry-66f6866d4-g7mhz\" (UID: \"a06f9194-8eb8-465c-a248-cdeff2ea3ec9\") " pod="openshift-image-registry/image-registry-66f6866d4-g7mhz" Apr 16 16:48:18.395910 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:48:18.395817 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 16:48:18.395910 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:48:18.395899 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e9c730fc-1a8b-4c50-92f3-2ebfd693c270-metrics-tls podName:e9c730fc-1a8b-4c50-92f3-2ebfd693c270 nodeName:}" failed. No retries permitted until 2026-04-16 16:48:20.395878012 +0000 UTC m=+37.066054789 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e9c730fc-1a8b-4c50-92f3-2ebfd693c270-metrics-tls") pod "dns-default-ndbsz" (UID: "e9c730fc-1a8b-4c50-92f3-2ebfd693c270") : secret "dns-default-metrics-tls" not found Apr 16 16:48:18.396081 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:48:18.395909 2568 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 16:48:18.396081 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:48:18.395929 2568 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-66f6866d4-g7mhz: secret "image-registry-tls" not found Apr 16 16:48:18.396081 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:48:18.395978 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a06f9194-8eb8-465c-a248-cdeff2ea3ec9-registry-tls podName:a06f9194-8eb8-465c-a248-cdeff2ea3ec9 nodeName:}" failed. No retries permitted until 2026-04-16 16:48:20.395962224 +0000 UTC m=+37.066139009 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/a06f9194-8eb8-465c-a248-cdeff2ea3ec9-registry-tls") pod "image-registry-66f6866d4-g7mhz" (UID: "a06f9194-8eb8-465c-a248-cdeff2ea3ec9") : secret "image-registry-tls" not found Apr 16 16:48:18.496363 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:18.496324 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/af4910d1-f39b-44e2-805e-bcc17c0e30d0-cert\") pod \"ingress-canary-4nrrn\" (UID: \"af4910d1-f39b-44e2-805e-bcc17c0e30d0\") " pod="openshift-ingress-canary/ingress-canary-4nrrn" Apr 16 16:48:18.496544 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:48:18.496504 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 16:48:18.496607 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:48:18.496595 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/af4910d1-f39b-44e2-805e-bcc17c0e30d0-cert podName:af4910d1-f39b-44e2-805e-bcc17c0e30d0 nodeName:}" failed. No retries permitted until 2026-04-16 16:48:20.496574613 +0000 UTC m=+37.166751374 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/af4910d1-f39b-44e2-805e-bcc17c0e30d0-cert") pod "ingress-canary-4nrrn" (UID: "af4910d1-f39b-44e2-805e-bcc17c0e30d0") : secret "canary-serving-cert" not found Apr 16 16:48:19.403256 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:19.403216 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/5ae3c2f7-2de4-4c3c-9cf8-bc6213e92f77-original-pull-secret\") pod \"global-pull-secret-syncer-4bqrw\" (UID: \"5ae3c2f7-2de4-4c3c-9cf8-bc6213e92f77\") " pod="kube-system/global-pull-secret-syncer-4bqrw" Apr 16 16:48:19.405717 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:19.405694 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/5ae3c2f7-2de4-4c3c-9cf8-bc6213e92f77-original-pull-secret\") pod \"global-pull-secret-syncer-4bqrw\" (UID: \"5ae3c2f7-2de4-4c3c-9cf8-bc6213e92f77\") " pod="kube-system/global-pull-secret-syncer-4bqrw" Apr 16 16:48:19.619763 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:19.619697 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-ts7hl"] Apr 16 16:48:19.623219 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:48:19.623184 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod018fd32e_3479_4227_9d81_8a232b27fc2b.slice/crio-667040b855a42d975da3fece527debc04d610dccfa11369699fdd76808a68379 WatchSource:0}: Error finding container 667040b855a42d975da3fece527debc04d610dccfa11369699fdd76808a68379: Status 404 returned error can't find the container with id 667040b855a42d975da3fece527debc04d610dccfa11369699fdd76808a68379 Apr 16 16:48:19.648284 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:19.648258 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-4bqrw" Apr 16 16:48:19.965582 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:19.965555 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-4bqrw"] Apr 16 16:48:19.975631 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:48:19.975607 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ae3c2f7_2de4_4c3c_9cf8_bc6213e92f77.slice/crio-0edb13a95c952ba790302b0f851604deaef8a3f9612ccc42a142526553c55298 WatchSource:0}: Error finding container 0edb13a95c952ba790302b0f851604deaef8a3f9612ccc42a142526553c55298: Status 404 returned error can't find the container with id 0edb13a95c952ba790302b0f851604deaef8a3f9612ccc42a142526553c55298 Apr 16 16:48:20.068664 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:20.068475 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-ts7hl" event={"ID":"018fd32e-3479-4227-9d81-8a232b27fc2b","Type":"ContainerStarted","Data":"667040b855a42d975da3fece527debc04d610dccfa11369699fdd76808a68379"} Apr 16 16:48:20.071083 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:20.071056 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-brchx" event={"ID":"482c17e3-998c-48aa-b158-037aa6ebf920","Type":"ContainerStarted","Data":"e19712db23180efbdb9bbbfdfef263908234ddd05b06dc2140d1e9faf570eaeb"} Apr 16 16:48:20.072205 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:20.072181 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-4bqrw" event={"ID":"5ae3c2f7-2de4-4c3c-9cf8-bc6213e92f77","Type":"ContainerStarted","Data":"0edb13a95c952ba790302b0f851604deaef8a3f9612ccc42a142526553c55298"} Apr 16 16:48:20.413001 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:20.412904 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a06f9194-8eb8-465c-a248-cdeff2ea3ec9-registry-tls\") pod \"image-registry-66f6866d4-g7mhz\" (UID: \"a06f9194-8eb8-465c-a248-cdeff2ea3ec9\") " pod="openshift-image-registry/image-registry-66f6866d4-g7mhz" Apr 16 16:48:20.413399 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:20.413013 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e9c730fc-1a8b-4c50-92f3-2ebfd693c270-metrics-tls\") pod \"dns-default-ndbsz\" (UID: \"e9c730fc-1a8b-4c50-92f3-2ebfd693c270\") " pod="openshift-dns/dns-default-ndbsz" Apr 16 16:48:20.413399 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:48:20.413074 2568 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 16:48:20.413399 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:48:20.413096 2568 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-66f6866d4-g7mhz: secret "image-registry-tls" not found Apr 16 16:48:20.413399 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:48:20.413147 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 16:48:20.413399 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:48:20.413165 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a06f9194-8eb8-465c-a248-cdeff2ea3ec9-registry-tls 
podName:a06f9194-8eb8-465c-a248-cdeff2ea3ec9 nodeName:}" failed. No retries permitted until 2026-04-16 16:48:24.413144557 +0000 UTC m=+41.083321330 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/a06f9194-8eb8-465c-a248-cdeff2ea3ec9-registry-tls") pod "image-registry-66f6866d4-g7mhz" (UID: "a06f9194-8eb8-465c-a248-cdeff2ea3ec9") : secret "image-registry-tls" not found Apr 16 16:48:20.413399 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:48:20.413202 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e9c730fc-1a8b-4c50-92f3-2ebfd693c270-metrics-tls podName:e9c730fc-1a8b-4c50-92f3-2ebfd693c270 nodeName:}" failed. No retries permitted until 2026-04-16 16:48:24.413184867 +0000 UTC m=+41.083361626 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e9c730fc-1a8b-4c50-92f3-2ebfd693c270-metrics-tls") pod "dns-default-ndbsz" (UID: "e9c730fc-1a8b-4c50-92f3-2ebfd693c270") : secret "dns-default-metrics-tls" not found Apr 16 16:48:20.513721 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:20.513683 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/af4910d1-f39b-44e2-805e-bcc17c0e30d0-cert\") pod \"ingress-canary-4nrrn\" (UID: \"af4910d1-f39b-44e2-805e-bcc17c0e30d0\") " pod="openshift-ingress-canary/ingress-canary-4nrrn" Apr 16 16:48:20.513889 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:48:20.513843 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 16:48:20.513965 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:48:20.513905 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/af4910d1-f39b-44e2-805e-bcc17c0e30d0-cert podName:af4910d1-f39b-44e2-805e-bcc17c0e30d0 nodeName:}" failed. No retries permitted until 2026-04-16 16:48:24.513891128 +0000 UTC m=+41.184067889 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/af4910d1-f39b-44e2-805e-bcc17c0e30d0-cert") pod "ingress-canary-4nrrn" (UID: "af4910d1-f39b-44e2-805e-bcc17c0e30d0") : secret "canary-serving-cert" not found Apr 16 16:48:21.077433 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:21.077364 2568 generic.go:358] "Generic (PLEG): container finished" podID="482c17e3-998c-48aa-b158-037aa6ebf920" containerID="e19712db23180efbdb9bbbfdfef263908234ddd05b06dc2140d1e9faf570eaeb" exitCode=0 Apr 16 16:48:21.077643 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:21.077451 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-brchx" event={"ID":"482c17e3-998c-48aa-b158-037aa6ebf920","Type":"ContainerDied","Data":"e19712db23180efbdb9bbbfdfef263908234ddd05b06dc2140d1e9faf570eaeb"} Apr 16 16:48:22.082813 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:22.082709 2568 generic.go:358] "Generic (PLEG): container finished" podID="482c17e3-998c-48aa-b158-037aa6ebf920" containerID="58c33665d445f16eaee56e8834278808c65b5dca6e4b95d78f41d2edc8cd69c9" exitCode=0 Apr 16 16:48:22.082813 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:22.082772 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-brchx" event={"ID":"482c17e3-998c-48aa-b158-037aa6ebf920","Type":"ContainerDied","Data":"58c33665d445f16eaee56e8834278808c65b5dca6e4b95d78f41d2edc8cd69c9"} Apr 16 16:48:24.448164 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:24.448082 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e9c730fc-1a8b-4c50-92f3-2ebfd693c270-metrics-tls\") pod \"dns-default-ndbsz\" (UID: \"e9c730fc-1a8b-4c50-92f3-2ebfd693c270\") " pod="openshift-dns/dns-default-ndbsz" Apr 16 16:48:24.448164 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:24.448141 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a06f9194-8eb8-465c-a248-cdeff2ea3ec9-registry-tls\") pod \"image-registry-66f6866d4-g7mhz\" (UID: \"a06f9194-8eb8-465c-a248-cdeff2ea3ec9\") " pod="openshift-image-registry/image-registry-66f6866d4-g7mhz" Apr 16 16:48:24.448664 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:48:24.448236 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 16:48:24.448664 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:48:24.448312 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e9c730fc-1a8b-4c50-92f3-2ebfd693c270-metrics-tls podName:e9c730fc-1a8b-4c50-92f3-2ebfd693c270 nodeName:}" failed. No retries permitted until 2026-04-16 16:48:32.44829712 +0000 UTC m=+49.118473877 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e9c730fc-1a8b-4c50-92f3-2ebfd693c270-metrics-tls") pod "dns-default-ndbsz" (UID: "e9c730fc-1a8b-4c50-92f3-2ebfd693c270") : secret "dns-default-metrics-tls" not found Apr 16 16:48:24.448664 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:48:24.448246 2568 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 16:48:24.448664 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:48:24.448346 2568 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-66f6866d4-g7mhz: secret "image-registry-tls" not found Apr 16 16:48:24.448664 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:48:24.448384 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a06f9194-8eb8-465c-a248-cdeff2ea3ec9-registry-tls podName:a06f9194-8eb8-465c-a248-cdeff2ea3ec9 nodeName:}" failed. No retries permitted until 2026-04-16 16:48:32.448373158 +0000 UTC m=+49.118549914 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/a06f9194-8eb8-465c-a248-cdeff2ea3ec9-registry-tls") pod "image-registry-66f6866d4-g7mhz" (UID: "a06f9194-8eb8-465c-a248-cdeff2ea3ec9") : secret "image-registry-tls" not found Apr 16 16:48:24.548599 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:24.548570 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/af4910d1-f39b-44e2-805e-bcc17c0e30d0-cert\") pod \"ingress-canary-4nrrn\" (UID: \"af4910d1-f39b-44e2-805e-bcc17c0e30d0\") " pod="openshift-ingress-canary/ingress-canary-4nrrn" Apr 16 16:48:24.548790 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:48:24.548770 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 16:48:24.548852 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:48:24.548842 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/af4910d1-f39b-44e2-805e-bcc17c0e30d0-cert podName:af4910d1-f39b-44e2-805e-bcc17c0e30d0 nodeName:}" failed. No retries permitted until 2026-04-16 16:48:32.548820943 +0000 UTC m=+49.218997704 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/af4910d1-f39b-44e2-805e-bcc17c0e30d0-cert") pod "ingress-canary-4nrrn" (UID: "af4910d1-f39b-44e2-805e-bcc17c0e30d0") : secret "canary-serving-cert" not found Apr 16 16:48:25.089951 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:25.089910 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-ts7hl" event={"ID":"018fd32e-3479-4227-9d81-8a232b27fc2b","Type":"ContainerStarted","Data":"ca649b8b63279a9fd3f25ef02bbc709a4f747a29a1c04ff92d1275c62021e613"} Apr 16 16:48:25.090122 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:25.090022 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-ts7hl" Apr 16 16:48:25.092747 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:25.092726 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-brchx" event={"ID":"482c17e3-998c-48aa-b158-037aa6ebf920","Type":"ContainerStarted","Data":"afdd510fb73e48d5ad1c6857c6577753faaeccc1ba30c5c00f745602a6a7d1fa"} Apr 16 16:48:25.094051 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:25.094028 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-4bqrw" event={"ID":"5ae3c2f7-2de4-4c3c-9cf8-bc6213e92f77","Type":"ContainerStarted","Data":"7a8809d3fb56322af2fbd496b15cd509791d346a5ef546612efdf4ec5d818a10"} Apr 16 16:48:25.109406 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:25.109365 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-ts7hl" podStartSLOduration=36.194386916 podStartE2EDuration="41.109354915s" podCreationTimestamp="2026-04-16 16:47:44 +0000 UTC" firstStartedPulling="2026-04-16 16:48:19.625200391 +0000 UTC m=+36.295377151" lastFinishedPulling="2026-04-16 16:48:24.540168377 +0000 UTC m=+41.210345150" observedRunningTime="2026-04-16 16:48:25.108570769 +0000 UTC m=+41.778747593" watchObservedRunningTime="2026-04-16 16:48:25.109354915 +0000 UTC m=+41.779531685" Apr 16 16:48:25.132075 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:25.132027 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-brchx" podStartSLOduration=7.967006014 podStartE2EDuration="41.132013044s" podCreationTimestamp="2026-04-16 16:47:44 +0000 UTC" firstStartedPulling="2026-04-16 16:47:46.68171466 +0000 UTC m=+3.351891420" lastFinishedPulling="2026-04-16 16:48:19.846721693 +0000 UTC m=+36.516898450" observedRunningTime="2026-04-16 16:48:25.130166189 +0000 UTC m=+41.800342979" watchObservedRunningTime="2026-04-16 16:48:25.132013044 +0000 UTC m=+41.802189835" Apr 16 16:48:25.146045 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:25.145996 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-4bqrw" podStartSLOduration=33.571194596 podStartE2EDuration="38.145982485s" podCreationTimestamp="2026-04-16 16:47:47 +0000 UTC" firstStartedPulling="2026-04-16 16:48:19.977210297 +0000 UTC m=+36.647387058" lastFinishedPulling="2026-04-16 16:48:24.551998186 +0000 UTC m=+41.222174947" observedRunningTime="2026-04-16 16:48:25.145273994 +0000 UTC m=+41.815450773" watchObservedRunningTime="2026-04-16 16:48:25.145982485 +0000 UTC m=+41.816159319" Apr 16 16:48:32.506397 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:32.506352 2568 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a06f9194-8eb8-465c-a248-cdeff2ea3ec9-registry-tls\") pod \"image-registry-66f6866d4-g7mhz\" (UID: \"a06f9194-8eb8-465c-a248-cdeff2ea3ec9\") " pod="openshift-image-registry/image-registry-66f6866d4-g7mhz" Apr 16 16:48:32.506837 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:32.506428 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e9c730fc-1a8b-4c50-92f3-2ebfd693c270-metrics-tls\") pod \"dns-default-ndbsz\" (UID: \"e9c730fc-1a8b-4c50-92f3-2ebfd693c270\") " pod="openshift-dns/dns-default-ndbsz" Apr 16 16:48:32.506837 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:48:32.506513 2568 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 16:48:32.506837 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:48:32.506543 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 16:48:32.506837 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:48:32.506551 2568 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-66f6866d4-g7mhz: secret "image-registry-tls" not found Apr 16 16:48:32.506837 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:48:32.506607 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e9c730fc-1a8b-4c50-92f3-2ebfd693c270-metrics-tls podName:e9c730fc-1a8b-4c50-92f3-2ebfd693c270 nodeName:}" failed. No retries permitted until 2026-04-16 16:48:48.506594312 +0000 UTC m=+65.176771069 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e9c730fc-1a8b-4c50-92f3-2ebfd693c270-metrics-tls") pod "dns-default-ndbsz" (UID: "e9c730fc-1a8b-4c50-92f3-2ebfd693c270") : secret "dns-default-metrics-tls" not found Apr 16 16:48:32.506837 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:48:32.506621 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a06f9194-8eb8-465c-a248-cdeff2ea3ec9-registry-tls podName:a06f9194-8eb8-465c-a248-cdeff2ea3ec9 nodeName:}" failed. No retries permitted until 2026-04-16 16:48:48.506614893 +0000 UTC m=+65.176791650 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/a06f9194-8eb8-465c-a248-cdeff2ea3ec9-registry-tls") pod "image-registry-66f6866d4-g7mhz" (UID: "a06f9194-8eb8-465c-a248-cdeff2ea3ec9") : secret "image-registry-tls" not found Apr 16 16:48:32.607057 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:32.607014 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/af4910d1-f39b-44e2-805e-bcc17c0e30d0-cert\") pod \"ingress-canary-4nrrn\" (UID: \"af4910d1-f39b-44e2-805e-bcc17c0e30d0\") " pod="openshift-ingress-canary/ingress-canary-4nrrn" Apr 16 16:48:32.607223 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:48:32.607153 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 16:48:32.607223 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:48:32.607215 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/af4910d1-f39b-44e2-805e-bcc17c0e30d0-cert podName:af4910d1-f39b-44e2-805e-bcc17c0e30d0 nodeName:}" failed. No retries permitted until 2026-04-16 16:48:48.607200429 +0000 UTC m=+65.277377186 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/af4910d1-f39b-44e2-805e-bcc17c0e30d0-cert") pod "ingress-canary-4nrrn" (UID: "af4910d1-f39b-44e2-805e-bcc17c0e30d0") : secret "canary-serving-cert" not found Apr 16 16:48:42.064610 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:42.064579 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-s2x9b" Apr 16 16:48:48.510215 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:48.510170 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e9c730fc-1a8b-4c50-92f3-2ebfd693c270-metrics-tls\") pod \"dns-default-ndbsz\" (UID: \"e9c730fc-1a8b-4c50-92f3-2ebfd693c270\") " pod="openshift-dns/dns-default-ndbsz" Apr 16 16:48:48.510686 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:48.510228 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a06f9194-8eb8-465c-a248-cdeff2ea3ec9-registry-tls\") pod \"image-registry-66f6866d4-g7mhz\" (UID: \"a06f9194-8eb8-465c-a248-cdeff2ea3ec9\") " pod="openshift-image-registry/image-registry-66f6866d4-g7mhz" Apr 16 16:48:48.510686 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:48:48.510328 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 16:48:48.510686 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:48:48.510398 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e9c730fc-1a8b-4c50-92f3-2ebfd693c270-metrics-tls podName:e9c730fc-1a8b-4c50-92f3-2ebfd693c270 nodeName:}" failed. No retries permitted until 2026-04-16 16:49:20.510380743 +0000 UTC m=+97.180557500 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e9c730fc-1a8b-4c50-92f3-2ebfd693c270-metrics-tls") pod "dns-default-ndbsz" (UID: "e9c730fc-1a8b-4c50-92f3-2ebfd693c270") : secret "dns-default-metrics-tls" not found Apr 16 16:48:48.510686 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:48:48.510337 2568 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 16:48:48.510686 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:48:48.510435 2568 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-66f6866d4-g7mhz: secret "image-registry-tls" not found Apr 16 16:48:48.510686 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:48:48.510487 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a06f9194-8eb8-465c-a248-cdeff2ea3ec9-registry-tls podName:a06f9194-8eb8-465c-a248-cdeff2ea3ec9 nodeName:}" failed. No retries permitted until 2026-04-16 16:49:20.510476254 +0000 UTC m=+97.180653010 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/a06f9194-8eb8-465c-a248-cdeff2ea3ec9-registry-tls") pod "image-registry-66f6866d4-g7mhz" (UID: "a06f9194-8eb8-465c-a248-cdeff2ea3ec9") : secret "image-registry-tls" not found Apr 16 16:48:48.611109 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:48.611062 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/af4910d1-f39b-44e2-805e-bcc17c0e30d0-cert\") pod \"ingress-canary-4nrrn\" (UID: \"af4910d1-f39b-44e2-805e-bcc17c0e30d0\") " pod="openshift-ingress-canary/ingress-canary-4nrrn" Apr 16 16:48:48.611280 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:48:48.611205 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 16:48:48.611280 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:48:48.611265 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/af4910d1-f39b-44e2-805e-bcc17c0e30d0-cert podName:af4910d1-f39b-44e2-805e-bcc17c0e30d0 nodeName:}" failed. No retries permitted until 2026-04-16 16:49:20.611250877 +0000 UTC m=+97.281427634 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/af4910d1-f39b-44e2-805e-bcc17c0e30d0-cert") pod "ingress-canary-4nrrn" (UID: "af4910d1-f39b-44e2-805e-bcc17c0e30d0") : secret "canary-serving-cert" not found Apr 16 16:48:49.617821 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:49.617772 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/65f280f9-caf6-429e-ac03-31bd647a05b6-metrics-certs\") pod \"network-metrics-daemon-5rwjz\" (UID: \"65f280f9-caf6-429e-ac03-31bd647a05b6\") " pod="openshift-multus/network-metrics-daemon-5rwjz" Apr 16 16:48:49.618226 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:48:49.617925 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 16:48:49.618226 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:48:49.617991 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/65f280f9-caf6-429e-ac03-31bd647a05b6-metrics-certs podName:65f280f9-caf6-429e-ac03-31bd647a05b6 nodeName:}" failed. 
No retries permitted until 2026-04-16 16:49:53.617976406 +0000 UTC m=+130.288153163 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/65f280f9-caf6-429e-ac03-31bd647a05b6-metrics-certs") pod "network-metrics-daemon-5rwjz" (UID: "65f280f9-caf6-429e-ac03-31bd647a05b6") : secret "metrics-daemon-secret" not found Apr 16 16:48:56.098905 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:48:56.098873 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-ts7hl" Apr 16 16:49:20.541172 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:49:20.541114 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a06f9194-8eb8-465c-a248-cdeff2ea3ec9-registry-tls\") pod \"image-registry-66f6866d4-g7mhz\" (UID: \"a06f9194-8eb8-465c-a248-cdeff2ea3ec9\") " pod="openshift-image-registry/image-registry-66f6866d4-g7mhz" Apr 16 16:49:20.541596 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:49:20.541270 2568 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 16:49:20.541596 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:49:20.541294 2568 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-66f6866d4-g7mhz: secret "image-registry-tls" not found Apr 16 16:49:20.541596 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:49:20.541293 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e9c730fc-1a8b-4c50-92f3-2ebfd693c270-metrics-tls\") pod \"dns-default-ndbsz\" (UID: \"e9c730fc-1a8b-4c50-92f3-2ebfd693c270\") " pod="openshift-dns/dns-default-ndbsz" Apr 16 16:49:20.541596 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:49:20.541342 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a06f9194-8eb8-465c-a248-cdeff2ea3ec9-registry-tls podName:a06f9194-8eb8-465c-a248-cdeff2ea3ec9 nodeName:}" failed. No retries permitted until 2026-04-16 16:50:24.541327149 +0000 UTC m=+161.211503906 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/a06f9194-8eb8-465c-a248-cdeff2ea3ec9-registry-tls") pod "image-registry-66f6866d4-g7mhz" (UID: "a06f9194-8eb8-465c-a248-cdeff2ea3ec9") : secret "image-registry-tls" not found Apr 16 16:49:20.541596 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:49:20.541387 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 16:49:20.541596 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:49:20.541437 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e9c730fc-1a8b-4c50-92f3-2ebfd693c270-metrics-tls podName:e9c730fc-1a8b-4c50-92f3-2ebfd693c270 nodeName:}" failed. No retries permitted until 2026-04-16 16:50:24.541423993 +0000 UTC m=+161.211600751 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e9c730fc-1a8b-4c50-92f3-2ebfd693c270-metrics-tls") pod "dns-default-ndbsz" (UID: "e9c730fc-1a8b-4c50-92f3-2ebfd693c270") : secret "dns-default-metrics-tls" not found Apr 16 16:49:20.641886 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:49:20.641843 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/af4910d1-f39b-44e2-805e-bcc17c0e30d0-cert\") pod \"ingress-canary-4nrrn\" (UID: \"af4910d1-f39b-44e2-805e-bcc17c0e30d0\") " pod="openshift-ingress-canary/ingress-canary-4nrrn" Apr 16 16:49:20.642078 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:49:20.642007 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 16:49:20.642123 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:49:20.642085 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/af4910d1-f39b-44e2-805e-bcc17c0e30d0-cert podName:af4910d1-f39b-44e2-805e-bcc17c0e30d0 nodeName:}" failed. No retries permitted until 2026-04-16 16:50:24.642069305 +0000 UTC m=+161.312246062 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/af4910d1-f39b-44e2-805e-bcc17c0e30d0-cert") pod "ingress-canary-4nrrn" (UID: "af4910d1-f39b-44e2-805e-bcc17c0e30d0") : secret "canary-serving-cert" not found Apr 16 16:49:53.672052 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:49:53.672015 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/65f280f9-caf6-429e-ac03-31bd647a05b6-metrics-certs\") pod \"network-metrics-daemon-5rwjz\" (UID: \"65f280f9-caf6-429e-ac03-31bd647a05b6\") " pod="openshift-multus/network-metrics-daemon-5rwjz" Apr 16 16:49:53.672687 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:49:53.672186 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 16:49:53.672687 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:49:53.672276 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/65f280f9-caf6-429e-ac03-31bd647a05b6-metrics-certs podName:65f280f9-caf6-429e-ac03-31bd647a05b6 nodeName:}" failed. No retries permitted until 2026-04-16 16:51:55.672253326 +0000 UTC m=+252.342430123 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/65f280f9-caf6-429e-ac03-31bd647a05b6-metrics-certs") pod "network-metrics-daemon-5rwjz" (UID: "65f280f9-caf6-429e-ac03-31bd647a05b6") : secret "metrics-daemon-secret" not found Apr 16 16:49:57.979175 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:49:57.979144 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-667775844f-zn9sv"] Apr 16 16:49:57.981827 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:49:57.981811 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6667474d89-xwh89"] Apr 16 16:49:57.981981 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:49:57.981961 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-zn9sv" Apr 16 16:49:57.984267 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:49:57.984249 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-667775844f-zn9sv"] Apr 16 16:49:57.984367 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:49:57.984337 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-xwh89" Apr 16 16:49:57.984912 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:49:57.984883 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Apr 16 16:49:57.985023 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:49:57.984912 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-b8f27\"" Apr 16 16:49:57.986142 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:49:57.986120 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" Apr 16 16:49:57.986142 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:49:57.986137 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" Apr 16 16:49:57.986985 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:49:57.986967 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 16 16:49:57.986985 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:49:57.986980 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 16 16:49:57.987419 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:49:57.987401 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-8rrpg\"" Apr 16 16:49:57.988875 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:49:57.988857 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\"" Apr 16 16:49:57.989002 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:49:57.988860 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\"" Apr 16 16:49:57.992008 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:49:57.991986 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6667474d89-xwh89"] Apr 16 16:49:58.105474 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:49:58.105438 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/073fb8a8-affb-434d-9a78-e8ef0444fc11-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-xwh89\" (UID: \"073fb8a8-affb-434d-9a78-e8ef0444fc11\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-xwh89" Apr 16 16:49:58.105669 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:49:58.105479 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cwhmc\" (UniqueName: 
\"kubernetes.io/projected/5d7eab9c-0611-4981-95ed-6811b8ca42d6-kube-api-access-cwhmc\") pod \"cluster-samples-operator-667775844f-zn9sv\" (UID: \"5d7eab9c-0611-4981-95ed-6811b8ca42d6\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-zn9sv" Apr 16 16:49:58.105669 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:49:58.105597 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/5d7eab9c-0611-4981-95ed-6811b8ca42d6-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-zn9sv\" (UID: \"5d7eab9c-0611-4981-95ed-6811b8ca42d6\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-zn9sv" Apr 16 16:49:58.105669 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:49:58.105634 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/073fb8a8-affb-434d-9a78-e8ef0444fc11-telemetry-config\") pod \"cluster-monitoring-operator-6667474d89-xwh89\" (UID: \"073fb8a8-affb-434d-9a78-e8ef0444fc11\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-xwh89" Apr 16 16:49:58.105669 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:49:58.105655 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgbg7\" (UniqueName: \"kubernetes.io/projected/073fb8a8-affb-434d-9a78-e8ef0444fc11-kube-api-access-kgbg7\") pod \"cluster-monitoring-operator-6667474d89-xwh89\" (UID: \"073fb8a8-affb-434d-9a78-e8ef0444fc11\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-xwh89" Apr 16 16:49:58.206846 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:49:58.206797 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kgbg7\" (UniqueName: \"kubernetes.io/projected/073fb8a8-affb-434d-9a78-e8ef0444fc11-kube-api-access-kgbg7\") pod \"cluster-monitoring-operator-6667474d89-xwh89\" (UID: \"073fb8a8-affb-434d-9a78-e8ef0444fc11\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-xwh89" Apr 16 16:49:58.206977 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:49:58.206914 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/073fb8a8-affb-434d-9a78-e8ef0444fc11-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-xwh89\" (UID: \"073fb8a8-affb-434d-9a78-e8ef0444fc11\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-xwh89" Apr 16 16:49:58.206977 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:49:58.206951 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cwhmc\" (UniqueName: \"kubernetes.io/projected/5d7eab9c-0611-4981-95ed-6811b8ca42d6-kube-api-access-cwhmc\") pod \"cluster-samples-operator-667775844f-zn9sv\" (UID: \"5d7eab9c-0611-4981-95ed-6811b8ca42d6\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-zn9sv" Apr 16 16:49:58.207043 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:49:58.206999 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/5d7eab9c-0611-4981-95ed-6811b8ca42d6-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-zn9sv\" (UID: \"5d7eab9c-0611-4981-95ed-6811b8ca42d6\") " 
pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-zn9sv" Apr 16 16:49:58.207043 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:49:58.207028 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/073fb8a8-affb-434d-9a78-e8ef0444fc11-telemetry-config\") pod \"cluster-monitoring-operator-6667474d89-xwh89\" (UID: \"073fb8a8-affb-434d-9a78-e8ef0444fc11\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-xwh89" Apr 16 16:49:58.207139 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:49:58.207045 2568 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 16:49:58.207139 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:49:58.207126 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/073fb8a8-affb-434d-9a78-e8ef0444fc11-cluster-monitoring-operator-tls podName:073fb8a8-affb-434d-9a78-e8ef0444fc11 nodeName:}" failed. No retries permitted until 2026-04-16 16:49:58.707106737 +0000 UTC m=+135.377283497 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/073fb8a8-affb-434d-9a78-e8ef0444fc11-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-xwh89" (UID: "073fb8a8-affb-434d-9a78-e8ef0444fc11") : secret "cluster-monitoring-operator-tls" not found Apr 16 16:49:58.207270 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:49:58.207153 2568 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 16:49:58.207270 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:49:58.207228 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5d7eab9c-0611-4981-95ed-6811b8ca42d6-samples-operator-tls podName:5d7eab9c-0611-4981-95ed-6811b8ca42d6 nodeName:}" failed. No retries permitted until 2026-04-16 16:49:58.707209216 +0000 UTC m=+135.377385989 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/5d7eab9c-0611-4981-95ed-6811b8ca42d6-samples-operator-tls") pod "cluster-samples-operator-667775844f-zn9sv" (UID: "5d7eab9c-0611-4981-95ed-6811b8ca42d6") : secret "samples-operator-tls" not found Apr 16 16:49:58.207730 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:49:58.207711 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/073fb8a8-affb-434d-9a78-e8ef0444fc11-telemetry-config\") pod \"cluster-monitoring-operator-6667474d89-xwh89\" (UID: \"073fb8a8-affb-434d-9a78-e8ef0444fc11\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-xwh89" Apr 16 16:49:58.217266 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:49:58.217241 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cwhmc\" (UniqueName: \"kubernetes.io/projected/5d7eab9c-0611-4981-95ed-6811b8ca42d6-kube-api-access-cwhmc\") pod \"cluster-samples-operator-667775844f-zn9sv\" (UID: \"5d7eab9c-0611-4981-95ed-6811b8ca42d6\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-zn9sv" Apr 16 16:49:58.217383 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:49:58.217288 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgbg7\" (UniqueName: \"kubernetes.io/projected/073fb8a8-affb-434d-9a78-e8ef0444fc11-kube-api-access-kgbg7\") pod \"cluster-monitoring-operator-6667474d89-xwh89\" (UID: \"073fb8a8-affb-434d-9a78-e8ef0444fc11\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-xwh89" Apr 16 16:49:58.711562 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:49:58.711488 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/073fb8a8-affb-434d-9a78-e8ef0444fc11-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-xwh89\" (UID: \"073fb8a8-affb-434d-9a78-e8ef0444fc11\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-xwh89" Apr 16 16:49:58.711752 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:49:58.711605 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/5d7eab9c-0611-4981-95ed-6811b8ca42d6-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-zn9sv\" (UID: \"5d7eab9c-0611-4981-95ed-6811b8ca42d6\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-zn9sv" Apr 16 16:49:58.711752 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:49:58.711635 2568 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 16:49:58.711752 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:49:58.711701 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/073fb8a8-affb-434d-9a78-e8ef0444fc11-cluster-monitoring-operator-tls podName:073fb8a8-affb-434d-9a78-e8ef0444fc11 nodeName:}" failed. No retries permitted until 2026-04-16 16:49:59.711685296 +0000 UTC m=+136.381862053 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/073fb8a8-affb-434d-9a78-e8ef0444fc11-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-xwh89" (UID: "073fb8a8-affb-434d-9a78-e8ef0444fc11") : secret "cluster-monitoring-operator-tls" not found Apr 16 16:49:58.711752 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:49:58.711713 2568 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 16:49:58.711886 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:49:58.711768 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5d7eab9c-0611-4981-95ed-6811b8ca42d6-samples-operator-tls podName:5d7eab9c-0611-4981-95ed-6811b8ca42d6 nodeName:}" failed. No retries permitted until 2026-04-16 16:49:59.71175178 +0000 UTC m=+136.381928538 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/5d7eab9c-0611-4981-95ed-6811b8ca42d6-samples-operator-tls") pod "cluster-samples-operator-667775844f-zn9sv" (UID: "5d7eab9c-0611-4981-95ed-6811b8ca42d6") : secret "samples-operator-tls" not found Apr 16 16:49:59.719675 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:49:59.719633 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/5d7eab9c-0611-4981-95ed-6811b8ca42d6-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-zn9sv\" (UID: \"5d7eab9c-0611-4981-95ed-6811b8ca42d6\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-zn9sv" Apr 16 16:49:59.720201 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:49:59.719746 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/073fb8a8-affb-434d-9a78-e8ef0444fc11-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-xwh89\" (UID: \"073fb8a8-affb-434d-9a78-e8ef0444fc11\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-xwh89" Apr 16 16:49:59.720201 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:49:59.719783 2568 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 16:49:59.720201 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:49:59.719862 2568 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 16:49:59.720201 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:49:59.719866 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5d7eab9c-0611-4981-95ed-6811b8ca42d6-samples-operator-tls podName:5d7eab9c-0611-4981-95ed-6811b8ca42d6 nodeName:}" failed. No retries permitted until 2026-04-16 16:50:01.719849833 +0000 UTC m=+138.390026590 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/5d7eab9c-0611-4981-95ed-6811b8ca42d6-samples-operator-tls") pod "cluster-samples-operator-667775844f-zn9sv" (UID: "5d7eab9c-0611-4981-95ed-6811b8ca42d6") : secret "samples-operator-tls" not found Apr 16 16:49:59.720201 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:49:59.719939 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/073fb8a8-affb-434d-9a78-e8ef0444fc11-cluster-monitoring-operator-tls podName:073fb8a8-affb-434d-9a78-e8ef0444fc11 nodeName:}" failed. No retries permitted until 2026-04-16 16:50:01.719920872 +0000 UTC m=+138.390097648 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/073fb8a8-affb-434d-9a78-e8ef0444fc11-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-xwh89" (UID: "073fb8a8-affb-434d-9a78-e8ef0444fc11") : secret "cluster-monitoring-operator-tls" not found Apr 16 16:49:59.878418 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:49:59.878382 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-69965bb79d-p8zc8"] Apr 16 16:49:59.881625 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:49:59.881604 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-p8zc8" Apr 16 16:49:59.884426 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:49:59.884400 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\"" Apr 16 16:49:59.884426 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:49:59.884420 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\"" Apr 16 16:49:59.885685 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:49:59.885665 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\"" Apr 16 16:49:59.885815 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:49:59.885701 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\"" Apr 16 16:49:59.885815 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:49:59.885726 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-bdf4c\"" Apr 16 16:49:59.888594 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:49:59.888573 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-69965bb79d-p8zc8"] Apr 16 16:50:00.022355 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:00.022321 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a7ed2b8a-4851-4792-868d-23a18751df58-config\") pod \"service-ca-operator-69965bb79d-p8zc8\" (UID: \"a7ed2b8a-4851-4792-868d-23a18751df58\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-p8zc8" Apr 16 16:50:00.022557 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:00.022369 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a7ed2b8a-4851-4792-868d-23a18751df58-serving-cert\") pod 
\"service-ca-operator-69965bb79d-p8zc8\" (UID: \"a7ed2b8a-4851-4792-868d-23a18751df58\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-p8zc8" Apr 16 16:50:00.022557 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:00.022479 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ql9w\" (UniqueName: \"kubernetes.io/projected/a7ed2b8a-4851-4792-868d-23a18751df58-kube-api-access-5ql9w\") pod \"service-ca-operator-69965bb79d-p8zc8\" (UID: \"a7ed2b8a-4851-4792-868d-23a18751df58\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-p8zc8" Apr 16 16:50:00.123154 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:00.123114 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5ql9w\" (UniqueName: \"kubernetes.io/projected/a7ed2b8a-4851-4792-868d-23a18751df58-kube-api-access-5ql9w\") pod \"service-ca-operator-69965bb79d-p8zc8\" (UID: \"a7ed2b8a-4851-4792-868d-23a18751df58\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-p8zc8" Apr 16 16:50:00.123350 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:00.123231 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a7ed2b8a-4851-4792-868d-23a18751df58-config\") pod \"service-ca-operator-69965bb79d-p8zc8\" (UID: \"a7ed2b8a-4851-4792-868d-23a18751df58\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-p8zc8" Apr 16 16:50:00.123350 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:00.123266 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a7ed2b8a-4851-4792-868d-23a18751df58-serving-cert\") pod \"service-ca-operator-69965bb79d-p8zc8\" (UID: \"a7ed2b8a-4851-4792-868d-23a18751df58\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-p8zc8" Apr 16 16:50:00.123795 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:00.123770 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a7ed2b8a-4851-4792-868d-23a18751df58-config\") pod \"service-ca-operator-69965bb79d-p8zc8\" (UID: \"a7ed2b8a-4851-4792-868d-23a18751df58\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-p8zc8" Apr 16 16:50:00.125480 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:00.125458 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a7ed2b8a-4851-4792-868d-23a18751df58-serving-cert\") pod \"service-ca-operator-69965bb79d-p8zc8\" (UID: \"a7ed2b8a-4851-4792-868d-23a18751df58\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-p8zc8" Apr 16 16:50:00.131242 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:00.131213 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ql9w\" (UniqueName: \"kubernetes.io/projected/a7ed2b8a-4851-4792-868d-23a18751df58-kube-api-access-5ql9w\") pod \"service-ca-operator-69965bb79d-p8zc8\" (UID: \"a7ed2b8a-4851-4792-868d-23a18751df58\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-p8zc8" Apr 16 16:50:00.191254 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:00.191199 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-p8zc8" Apr 16 16:50:00.305692 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:00.305620 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-69965bb79d-p8zc8"] Apr 16 16:50:00.308776 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:50:00.308743 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda7ed2b8a_4851_4792_868d_23a18751df58.slice/crio-09cce179fba20f48f36a7f90cdcd12899ffd14a9000fc1f8ff7bcd242dbbb8fc WatchSource:0}: Error finding container 09cce179fba20f48f36a7f90cdcd12899ffd14a9000fc1f8ff7bcd242dbbb8fc: Status 404 returned error can't find the container with id 09cce179fba20f48f36a7f90cdcd12899ffd14a9000fc1f8ff7bcd242dbbb8fc Apr 16 16:50:01.285204 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:01.285171 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-p8zc8" event={"ID":"a7ed2b8a-4851-4792-868d-23a18751df58","Type":"ContainerStarted","Data":"09cce179fba20f48f36a7f90cdcd12899ffd14a9000fc1f8ff7bcd242dbbb8fc"} Apr 16 16:50:01.735788 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:01.735691 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/073fb8a8-affb-434d-9a78-e8ef0444fc11-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-xwh89\" (UID: \"073fb8a8-affb-434d-9a78-e8ef0444fc11\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-xwh89" Apr 16 16:50:01.735788 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:01.735757 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/5d7eab9c-0611-4981-95ed-6811b8ca42d6-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-zn9sv\" (UID: \"5d7eab9c-0611-4981-95ed-6811b8ca42d6\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-zn9sv" Apr 16 16:50:01.736007 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:50:01.735872 2568 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 16:50:01.736007 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:50:01.735878 2568 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 16:50:01.736007 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:50:01.735919 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5d7eab9c-0611-4981-95ed-6811b8ca42d6-samples-operator-tls podName:5d7eab9c-0611-4981-95ed-6811b8ca42d6 nodeName:}" failed. No retries permitted until 2026-04-16 16:50:05.735905759 +0000 UTC m=+142.406082516 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/5d7eab9c-0611-4981-95ed-6811b8ca42d6-samples-operator-tls") pod "cluster-samples-operator-667775844f-zn9sv" (UID: "5d7eab9c-0611-4981-95ed-6811b8ca42d6") : secret "samples-operator-tls" not found Apr 16 16:50:01.736007 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:50:01.735943 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/073fb8a8-affb-434d-9a78-e8ef0444fc11-cluster-monitoring-operator-tls podName:073fb8a8-affb-434d-9a78-e8ef0444fc11 nodeName:}" failed. No retries permitted until 2026-04-16 16:50:05.735925631 +0000 UTC m=+142.406102408 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/073fb8a8-affb-434d-9a78-e8ef0444fc11-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-xwh89" (UID: "073fb8a8-affb-434d-9a78-e8ef0444fc11") : secret "cluster-monitoring-operator-tls" not found Apr 16 16:50:02.288338 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:02.288304 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-p8zc8" event={"ID":"a7ed2b8a-4851-4792-868d-23a18751df58","Type":"ContainerStarted","Data":"daa5ab1cced67ab811b46ffd8aab939bbe48a31277d4f8f97837d287f465e941"} Apr 16 16:50:02.305491 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:02.305443 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-p8zc8" podStartSLOduration=1.8573283950000001 podStartE2EDuration="3.305427356s" podCreationTimestamp="2026-04-16 16:49:59 +0000 UTC" firstStartedPulling="2026-04-16 16:50:00.311150793 +0000 UTC m=+136.981327550" lastFinishedPulling="2026-04-16 16:50:01.759249751 +0000 UTC m=+138.429426511" observedRunningTime="2026-04-16 16:50:02.303834322 +0000 UTC m=+138.974011101" watchObservedRunningTime="2026-04-16 16:50:02.305427356 +0000 UTC m=+138.975604135" Apr 16 16:50:04.963916 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:04.963883 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-ptmqw_0c447c29-c470-4314-becc-ad24580321c8/dns-node-resolver/0.log" Apr 16 16:50:05.518148 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:05.518118 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-bfc587fb7-stqj4"] Apr 16 16:50:05.521049 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:05.521033 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-bfc587fb7-stqj4" Apr 16 16:50:05.523854 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:05.523822 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-6qv7h\"" Apr 16 16:50:05.524041 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:05.524027 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 16 16:50:05.525217 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:05.525195 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 16 16:50:05.525321 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:05.525220 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 16 16:50:05.525321 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:05.525196 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 16 16:50:05.532823 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:05.532803 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-bfc587fb7-stqj4"] Apr 16 16:50:05.566226 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:05.566196 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/e23d01eb-8ff6-40f3-a05d-347fdc6d12b3-signing-key\") pod \"service-ca-bfc587fb7-stqj4\" (UID: \"e23d01eb-8ff6-40f3-a05d-347fdc6d12b3\") " pod="openshift-service-ca/service-ca-bfc587fb7-stqj4" Apr 16 16:50:05.566384 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:05.566236 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/e23d01eb-8ff6-40f3-a05d-347fdc6d12b3-signing-cabundle\") pod \"service-ca-bfc587fb7-stqj4\" (UID: \"e23d01eb-8ff6-40f3-a05d-347fdc6d12b3\") " pod="openshift-service-ca/service-ca-bfc587fb7-stqj4" Apr 16 16:50:05.566384 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:05.566271 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5k4vk\" (UniqueName: \"kubernetes.io/projected/e23d01eb-8ff6-40f3-a05d-347fdc6d12b3-kube-api-access-5k4vk\") pod \"service-ca-bfc587fb7-stqj4\" (UID: \"e23d01eb-8ff6-40f3-a05d-347fdc6d12b3\") " pod="openshift-service-ca/service-ca-bfc587fb7-stqj4" Apr 16 16:50:05.667009 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:05.666977 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/e23d01eb-8ff6-40f3-a05d-347fdc6d12b3-signing-key\") pod \"service-ca-bfc587fb7-stqj4\" (UID: \"e23d01eb-8ff6-40f3-a05d-347fdc6d12b3\") " pod="openshift-service-ca/service-ca-bfc587fb7-stqj4" Apr 16 16:50:05.667140 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:05.667022 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/e23d01eb-8ff6-40f3-a05d-347fdc6d12b3-signing-cabundle\") pod \"service-ca-bfc587fb7-stqj4\" (UID: \"e23d01eb-8ff6-40f3-a05d-347fdc6d12b3\") " pod="openshift-service-ca/service-ca-bfc587fb7-stqj4" Apr 16 16:50:05.667140 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:05.667068 2568 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5k4vk\" (UniqueName: \"kubernetes.io/projected/e23d01eb-8ff6-40f3-a05d-347fdc6d12b3-kube-api-access-5k4vk\") pod \"service-ca-bfc587fb7-stqj4\" (UID: \"e23d01eb-8ff6-40f3-a05d-347fdc6d12b3\") " pod="openshift-service-ca/service-ca-bfc587fb7-stqj4" Apr 16 16:50:05.667851 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:05.667816 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/e23d01eb-8ff6-40f3-a05d-347fdc6d12b3-signing-cabundle\") pod \"service-ca-bfc587fb7-stqj4\" (UID: \"e23d01eb-8ff6-40f3-a05d-347fdc6d12b3\") " pod="openshift-service-ca/service-ca-bfc587fb7-stqj4" Apr 16 16:50:05.669629 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:05.669601 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/e23d01eb-8ff6-40f3-a05d-347fdc6d12b3-signing-key\") pod \"service-ca-bfc587fb7-stqj4\" (UID: \"e23d01eb-8ff6-40f3-a05d-347fdc6d12b3\") " pod="openshift-service-ca/service-ca-bfc587fb7-stqj4" Apr 16 16:50:05.684087 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:05.684056 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5k4vk\" (UniqueName: \"kubernetes.io/projected/e23d01eb-8ff6-40f3-a05d-347fdc6d12b3-kube-api-access-5k4vk\") pod \"service-ca-bfc587fb7-stqj4\" (UID: \"e23d01eb-8ff6-40f3-a05d-347fdc6d12b3\") " pod="openshift-service-ca/service-ca-bfc587fb7-stqj4" Apr 16 16:50:05.768093 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:05.768057 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/5d7eab9c-0611-4981-95ed-6811b8ca42d6-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-zn9sv\" (UID: \"5d7eab9c-0611-4981-95ed-6811b8ca42d6\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-zn9sv" Apr 16 16:50:05.768255 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:05.768139 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/073fb8a8-affb-434d-9a78-e8ef0444fc11-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-xwh89\" (UID: \"073fb8a8-affb-434d-9a78-e8ef0444fc11\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-xwh89" Apr 16 16:50:05.768255 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:50:05.768212 2568 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 16:50:05.768255 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:50:05.768228 2568 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 16:50:05.768358 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:50:05.768293 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5d7eab9c-0611-4981-95ed-6811b8ca42d6-samples-operator-tls podName:5d7eab9c-0611-4981-95ed-6811b8ca42d6 nodeName:}" failed. No retries permitted until 2026-04-16 16:50:13.768272806 +0000 UTC m=+150.438449580 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/5d7eab9c-0611-4981-95ed-6811b8ca42d6-samples-operator-tls") pod "cluster-samples-operator-667775844f-zn9sv" (UID: "5d7eab9c-0611-4981-95ed-6811b8ca42d6") : secret "samples-operator-tls" not found Apr 16 16:50:05.768358 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:50:05.768315 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/073fb8a8-affb-434d-9a78-e8ef0444fc11-cluster-monitoring-operator-tls podName:073fb8a8-affb-434d-9a78-e8ef0444fc11 nodeName:}" failed. No retries permitted until 2026-04-16 16:50:13.768305586 +0000 UTC m=+150.438482350 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/073fb8a8-affb-434d-9a78-e8ef0444fc11-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-xwh89" (UID: "073fb8a8-affb-434d-9a78-e8ef0444fc11") : secret "cluster-monitoring-operator-tls" not found Apr 16 16:50:05.829440 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:05.829403 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-bfc587fb7-stqj4" Apr 16 16:50:05.943873 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:05.943845 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-bfc587fb7-stqj4"] Apr 16 16:50:05.946336 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:50:05.946302 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode23d01eb_8ff6_40f3_a05d_347fdc6d12b3.slice/crio-40be154761546ea5a38949edb1ff1bc73fae64958db52532226f55bdbea0a684 WatchSource:0}: Error finding container 40be154761546ea5a38949edb1ff1bc73fae64958db52532226f55bdbea0a684: Status 404 returned error can't find the container with id 40be154761546ea5a38949edb1ff1bc73fae64958db52532226f55bdbea0a684 Apr 16 16:50:05.964113 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:05.964096 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-j8nrt_c39d9cc0-8cac-46bd-968d-baba878cd954/node-ca/0.log" Apr 16 16:50:06.296883 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:06.296852 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-bfc587fb7-stqj4" event={"ID":"e23d01eb-8ff6-40f3-a05d-347fdc6d12b3","Type":"ContainerStarted","Data":"7cc0fcb38158bf8c3a935cd9da725b9f3077246726ba8b98a5c1b4bb2f51c77a"} Apr 16 16:50:06.296883 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:06.296885 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-bfc587fb7-stqj4" event={"ID":"e23d01eb-8ff6-40f3-a05d-347fdc6d12b3","Type":"ContainerStarted","Data":"40be154761546ea5a38949edb1ff1bc73fae64958db52532226f55bdbea0a684"} Apr 16 16:50:06.317109 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:06.317058 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-bfc587fb7-stqj4" podStartSLOduration=1.317044971 podStartE2EDuration="1.317044971s" podCreationTimestamp="2026-04-16 16:50:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:50:06.315242702 +0000 UTC m=+142.985419483" watchObservedRunningTime="2026-04-16 16:50:06.317044971 +0000 UTC m=+142.987221749" Apr 16 16:50:13.828323 ip-10-0-131-63 
kubenswrapper[2568]: I0416 16:50:13.828263 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/5d7eab9c-0611-4981-95ed-6811b8ca42d6-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-zn9sv\" (UID: \"5d7eab9c-0611-4981-95ed-6811b8ca42d6\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-zn9sv" Apr 16 16:50:13.828840 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:13.828390 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/073fb8a8-affb-434d-9a78-e8ef0444fc11-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-xwh89\" (UID: \"073fb8a8-affb-434d-9a78-e8ef0444fc11\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-xwh89" Apr 16 16:50:13.828840 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:50:13.828508 2568 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 16:50:13.828840 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:50:13.828606 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/073fb8a8-affb-434d-9a78-e8ef0444fc11-cluster-monitoring-operator-tls podName:073fb8a8-affb-434d-9a78-e8ef0444fc11 nodeName:}" failed. No retries permitted until 2026-04-16 16:50:29.828584995 +0000 UTC m=+166.498761752 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/073fb8a8-affb-434d-9a78-e8ef0444fc11-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-xwh89" (UID: "073fb8a8-affb-434d-9a78-e8ef0444fc11") : secret "cluster-monitoring-operator-tls" not found Apr 16 16:50:13.830871 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:13.830850 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/5d7eab9c-0611-4981-95ed-6811b8ca42d6-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-zn9sv\" (UID: \"5d7eab9c-0611-4981-95ed-6811b8ca42d6\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-zn9sv" Apr 16 16:50:13.892250 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:13.892224 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-zn9sv" Apr 16 16:50:14.005331 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:14.005298 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-667775844f-zn9sv"] Apr 16 16:50:14.313292 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:14.313249 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-zn9sv" event={"ID":"5d7eab9c-0611-4981-95ed-6811b8ca42d6","Type":"ContainerStarted","Data":"ad2eacb845d3648535d1387a88b94cac7f8d042bac1f13c288ec6f8342a034c3"} Apr 16 16:50:16.320175 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:16.320136 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-zn9sv" event={"ID":"5d7eab9c-0611-4981-95ed-6811b8ca42d6","Type":"ContainerStarted","Data":"d751f101a5289c0d5d07dcb8b50530caee4141d6a5bd140fde7f012afad664c5"} Apr 16 16:50:16.320175 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:16.320173 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-zn9sv" event={"ID":"5d7eab9c-0611-4981-95ed-6811b8ca42d6","Type":"ContainerStarted","Data":"e8418655e9666fb23ccad525e6b7819151604b0042cd446c028e492deaaf1d25"} Apr 16 16:50:16.339458 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:16.339407 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-zn9sv" podStartSLOduration=17.568893666 podStartE2EDuration="19.339391659s" podCreationTimestamp="2026-04-16 16:49:57 +0000 UTC" firstStartedPulling="2026-04-16 16:50:14.046391308 +0000 UTC m=+150.716568064" lastFinishedPulling="2026-04-16 16:50:15.816889288 +0000 UTC m=+152.487066057" observedRunningTime="2026-04-16 16:50:16.339127633 +0000 UTC m=+153.009304412" watchObservedRunningTime="2026-04-16 16:50:16.339391659 +0000 UTC m=+153.009568441" Apr 16 16:50:19.670079 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:50:19.670034 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-66f6866d4-g7mhz" podUID="a06f9194-8eb8-465c-a248-cdeff2ea3ec9" Apr 16 16:50:19.682284 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:50:19.682249 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-ndbsz" podUID="e9c730fc-1a8b-4c50-92f3-2ebfd693c270" Apr 16 16:50:19.706457 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:50:19.706424 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-4nrrn" podUID="af4910d1-f39b-44e2-805e-bcc17c0e30d0" Apr 16 16:50:19.940819 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:50:19.940719 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-5rwjz" podUID="65f280f9-caf6-429e-ac03-31bd647a05b6" Apr 16 16:50:20.330080 
ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:20.330050 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66f6866d4-g7mhz" Apr 16 16:50:20.330245 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:20.330112 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-ndbsz" Apr 16 16:50:24.611197 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:24.611159 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e9c730fc-1a8b-4c50-92f3-2ebfd693c270-metrics-tls\") pod \"dns-default-ndbsz\" (UID: \"e9c730fc-1a8b-4c50-92f3-2ebfd693c270\") " pod="openshift-dns/dns-default-ndbsz" Apr 16 16:50:24.611621 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:24.611221 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a06f9194-8eb8-465c-a248-cdeff2ea3ec9-registry-tls\") pod \"image-registry-66f6866d4-g7mhz\" (UID: \"a06f9194-8eb8-465c-a248-cdeff2ea3ec9\") " pod="openshift-image-registry/image-registry-66f6866d4-g7mhz" Apr 16 16:50:24.613457 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:24.613437 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e9c730fc-1a8b-4c50-92f3-2ebfd693c270-metrics-tls\") pod \"dns-default-ndbsz\" (UID: \"e9c730fc-1a8b-4c50-92f3-2ebfd693c270\") " pod="openshift-dns/dns-default-ndbsz" Apr 16 16:50:24.613655 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:24.613638 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a06f9194-8eb8-465c-a248-cdeff2ea3ec9-registry-tls\") pod \"image-registry-66f6866d4-g7mhz\" (UID: \"a06f9194-8eb8-465c-a248-cdeff2ea3ec9\") " pod="openshift-image-registry/image-registry-66f6866d4-g7mhz" Apr 16 16:50:24.711848 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:24.711816 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/af4910d1-f39b-44e2-805e-bcc17c0e30d0-cert\") pod \"ingress-canary-4nrrn\" (UID: \"af4910d1-f39b-44e2-805e-bcc17c0e30d0\") " pod="openshift-ingress-canary/ingress-canary-4nrrn" Apr 16 16:50:24.714134 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:24.714104 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/af4910d1-f39b-44e2-805e-bcc17c0e30d0-cert\") pod \"ingress-canary-4nrrn\" (UID: \"af4910d1-f39b-44e2-805e-bcc17c0e30d0\") " pod="openshift-ingress-canary/ingress-canary-4nrrn" Apr 16 16:50:24.834344 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:24.834315 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-z7whv\"" Apr 16 16:50:24.834344 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:24.834315 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-zkp2p\"" Apr 16 16:50:24.840667 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:24.840640 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66f6866d4-g7mhz" Apr 16 16:50:24.840737 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:24.840682 2568 util.go:30] "No sandbox for pod can be found. 
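By 16:50:24 the secrets that had been missing since the node came up exist, and the pending mounts (metrics-tls, registry-tls, cert) succeed on their next scheduled retry with no intervention. The "context deadline exceeded" errors at 16:50:19 just above are the pod workers abandoning a single sync attempt after roughly two minutes of waiting on the unmounted volumes; the pods are synced again and start once the mounts go through. When scripting around this kind of startup race, one option is to block until the secret appears. A hedged client-go sketch — namespace and name taken from the records above; a watch with resourceVersion unset begins with synthetic ADDED events for existing objects, so this also returns immediately if the secret is already there:

    package main

    import (
        "context"
        "fmt"

        metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
        "k8s.io/apimachinery/pkg/watch"
        "k8s.io/client-go/kubernetes"
        "k8s.io/client-go/tools/clientcmd"
    )

    func main() {
        cfg, err := clientcmd.BuildConfigFromFlags("", "/root/.kube/config")
        if err != nil {
            panic(err)
        }
        client := kubernetes.NewForConfigOrDie(cfg)

        // One of the secrets that was missing above and exists by 16:50:24.
        ns, name := "openshift-dns", "dns-default-metrics-tls"
        w, err := client.CoreV1().Secrets(ns).Watch(context.TODO(), metav1.ListOptions{
            FieldSelector: "metadata.name=" + name,
        })
        if err != nil {
            panic(err)
        }
        defer w.Stop()

        for ev := range w.ResultChan() {
            if ev.Type == watch.Added || ev.Type == watch.Modified {
                fmt.Printf("secret %s/%s is present; the kubelet's next retry will mount it\n", ns, name)
                return
            }
        }
    }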
Need to start a new one" pod="openshift-dns/dns-default-ndbsz" Apr 16 16:50:24.961998 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:24.961976 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-ndbsz"] Apr 16 16:50:24.964610 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:50:24.964583 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode9c730fc_1a8b_4c50_92f3_2ebfd693c270.slice/crio-1e975d460693819ef1693443cf5a1f9e45648fb9453649876561dcc652e6cf48 WatchSource:0}: Error finding container 1e975d460693819ef1693443cf5a1f9e45648fb9453649876561dcc652e6cf48: Status 404 returned error can't find the container with id 1e975d460693819ef1693443cf5a1f9e45648fb9453649876561dcc652e6cf48 Apr 16 16:50:24.980794 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:24.980771 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66f6866d4-g7mhz"] Apr 16 16:50:24.983090 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:50:24.983067 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda06f9194_8eb8_465c_a248_cdeff2ea3ec9.slice/crio-7650009c6017bab8b84ea4b76da4ea91df610d8ffb6ba3bfe7c32df5c15c2f5c WatchSource:0}: Error finding container 7650009c6017bab8b84ea4b76da4ea91df610d8ffb6ba3bfe7c32df5c15c2f5c: Status 404 returned error can't find the container with id 7650009c6017bab8b84ea4b76da4ea91df610d8ffb6ba3bfe7c32df5c15c2f5c Apr 16 16:50:25.347336 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:25.347295 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-ndbsz" event={"ID":"e9c730fc-1a8b-4c50-92f3-2ebfd693c270","Type":"ContainerStarted","Data":"1e975d460693819ef1693443cf5a1f9e45648fb9453649876561dcc652e6cf48"} Apr 16 16:50:25.348781 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:25.348742 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66f6866d4-g7mhz" event={"ID":"a06f9194-8eb8-465c-a248-cdeff2ea3ec9","Type":"ContainerStarted","Data":"e51a7557399529693ae4a60592003c9810a423e56a4369f21a735e55a6898bc5"} Apr 16 16:50:25.348781 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:25.348777 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66f6866d4-g7mhz" event={"ID":"a06f9194-8eb8-465c-a248-cdeff2ea3ec9","Type":"ContainerStarted","Data":"7650009c6017bab8b84ea4b76da4ea91df610d8ffb6ba3bfe7c32df5c15c2f5c"} Apr 16 16:50:25.348975 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:25.348919 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-66f6866d4-g7mhz" Apr 16 16:50:25.392399 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:25.392335 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66f6866d4-g7mhz" podStartSLOduration=168.392321096 podStartE2EDuration="2m48.392321096s" podCreationTimestamp="2026-04-16 16:47:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:50:25.390560286 +0000 UTC m=+162.060737066" watchObservedRunningTime="2026-04-16 16:50:25.392321096 +0000 UTC m=+162.062497875" Apr 16 16:50:25.445055 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:25.445018 2568 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-insights/insights-runtime-extractor-ctpwk"] Apr 16 16:50:25.449644 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:25.449617 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-ctpwk" Apr 16 16:50:25.453034 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:25.453008 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 16 16:50:25.453398 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:25.453244 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-w2v8f\"" Apr 16 16:50:25.453398 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:25.453286 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 16 16:50:25.453398 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:25.453395 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 16 16:50:25.453919 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:25.453894 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 16 16:50:25.462780 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:25.462758 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-ctpwk"] Apr 16 16:50:25.471215 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:25.471191 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-66f6866d4-g7mhz"] Apr 16 16:50:25.507080 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:25.507030 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-7655577c6d-7np48"] Apr 16 16:50:25.510207 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:25.510187 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-7655577c6d-7np48"
Apr 16 16:50:25.518108 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:25.518085 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/8123167b-2df8-460e-ab9c-5bdd6b4a099f-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-ctpwk\" (UID: \"8123167b-2df8-460e-ab9c-5bdd6b4a099f\") " pod="openshift-insights/insights-runtime-extractor-ctpwk"
Apr 16 16:50:25.518232 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:25.518123 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6qwd\" (UniqueName: \"kubernetes.io/projected/8123167b-2df8-460e-ab9c-5bdd6b4a099f-kube-api-access-m6qwd\") pod \"insights-runtime-extractor-ctpwk\" (UID: \"8123167b-2df8-460e-ab9c-5bdd6b4a099f\") " pod="openshift-insights/insights-runtime-extractor-ctpwk"
Apr 16 16:50:25.518396 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:25.518175 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/8123167b-2df8-460e-ab9c-5bdd6b4a099f-crio-socket\") pod \"insights-runtime-extractor-ctpwk\" (UID: \"8123167b-2df8-460e-ab9c-5bdd6b4a099f\") " pod="openshift-insights/insights-runtime-extractor-ctpwk"
Apr 16 16:50:25.518481 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:25.518452 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/8123167b-2df8-460e-ab9c-5bdd6b4a099f-data-volume\") pod \"insights-runtime-extractor-ctpwk\" (UID: \"8123167b-2df8-460e-ab9c-5bdd6b4a099f\") " pod="openshift-insights/insights-runtime-extractor-ctpwk"
Apr 16 16:50:25.518559 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:25.518499 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/8123167b-2df8-460e-ab9c-5bdd6b4a099f-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-ctpwk\" (UID: \"8123167b-2df8-460e-ab9c-5bdd6b4a099f\") " pod="openshift-insights/insights-runtime-extractor-ctpwk"
Apr 16 16:50:25.519534 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:25.519501 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-7655577c6d-7np48"]
Apr 16 16:50:25.619346 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:25.619265 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9d1ef785-865a-4cf2-b57e-d3b55eea5a1c-bound-sa-token\") pod \"image-registry-7655577c6d-7np48\" (UID: \"9d1ef785-865a-4cf2-b57e-d3b55eea5a1c\") " pod="openshift-image-registry/image-registry-7655577c6d-7np48"
Apr 16 16:50:25.619346 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:25.619319 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9d1ef785-865a-4cf2-b57e-d3b55eea5a1c-registry-certificates\") pod \"image-registry-7655577c6d-7np48\" (UID: \"9d1ef785-865a-4cf2-b57e-d3b55eea5a1c\") " pod="openshift-image-registry/image-registry-7655577c6d-7np48"
Apr 16 16:50:25.619851 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:25.619373 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/8123167b-2df8-460e-ab9c-5bdd6b4a099f-crio-socket\") pod \"insights-runtime-extractor-ctpwk\" (UID: \"8123167b-2df8-460e-ab9c-5bdd6b4a099f\") " pod="openshift-insights/insights-runtime-extractor-ctpwk"
Apr 16 16:50:25.619851 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:25.619412 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/8123167b-2df8-460e-ab9c-5bdd6b4a099f-data-volume\") pod \"insights-runtime-extractor-ctpwk\" (UID: \"8123167b-2df8-460e-ab9c-5bdd6b4a099f\") " pod="openshift-insights/insights-runtime-extractor-ctpwk"
Apr 16 16:50:25.619851 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:25.619436 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9d1ef785-865a-4cf2-b57e-d3b55eea5a1c-registry-tls\") pod \"image-registry-7655577c6d-7np48\" (UID: \"9d1ef785-865a-4cf2-b57e-d3b55eea5a1c\") " pod="openshift-image-registry/image-registry-7655577c6d-7np48"
Apr 16 16:50:25.619851 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:25.619481 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/8123167b-2df8-460e-ab9c-5bdd6b4a099f-crio-socket\") pod \"insights-runtime-extractor-ctpwk\" (UID: \"8123167b-2df8-460e-ab9c-5bdd6b4a099f\") " pod="openshift-insights/insights-runtime-extractor-ctpwk"
Apr 16 16:50:25.619851 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:25.619483 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9d1ef785-865a-4cf2-b57e-d3b55eea5a1c-ca-trust-extracted\") pod \"image-registry-7655577c6d-7np48\" (UID: \"9d1ef785-865a-4cf2-b57e-d3b55eea5a1c\") " pod="openshift-image-registry/image-registry-7655577c6d-7np48"
Apr 16 16:50:25.619851 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:25.619554 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d1ef785-865a-4cf2-b57e-d3b55eea5a1c-trusted-ca\") pod \"image-registry-7655577c6d-7np48\" (UID: \"9d1ef785-865a-4cf2-b57e-d3b55eea5a1c\") " pod="openshift-image-registry/image-registry-7655577c6d-7np48"
Apr 16 16:50:25.619851 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:25.619586 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/9d1ef785-865a-4cf2-b57e-d3b55eea5a1c-image-registry-private-configuration\") pod \"image-registry-7655577c6d-7np48\" (UID: \"9d1ef785-865a-4cf2-b57e-d3b55eea5a1c\") " pod="openshift-image-registry/image-registry-7655577c6d-7np48"
Apr 16 16:50:25.619851 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:25.619620 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/8123167b-2df8-460e-ab9c-5bdd6b4a099f-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-ctpwk\" (UID: \"8123167b-2df8-460e-ab9c-5bdd6b4a099f\") " pod="openshift-insights/insights-runtime-extractor-ctpwk"
Apr 16 16:50:25.619851 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:25.619653 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9d1ef785-865a-4cf2-b57e-d3b55eea5a1c-installation-pull-secrets\") pod \"image-registry-7655577c6d-7np48\" (UID: \"9d1ef785-865a-4cf2-b57e-d3b55eea5a1c\") " pod="openshift-image-registry/image-registry-7655577c6d-7np48"
Apr 16 16:50:25.619851 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:25.619678 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhtvc\" (UniqueName: \"kubernetes.io/projected/9d1ef785-865a-4cf2-b57e-d3b55eea5a1c-kube-api-access-qhtvc\") pod \"image-registry-7655577c6d-7np48\" (UID: \"9d1ef785-865a-4cf2-b57e-d3b55eea5a1c\") " pod="openshift-image-registry/image-registry-7655577c6d-7np48"
Apr 16 16:50:25.619851 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:25.619744 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/8123167b-2df8-460e-ab9c-5bdd6b4a099f-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-ctpwk\" (UID: \"8123167b-2df8-460e-ab9c-5bdd6b4a099f\") " pod="openshift-insights/insights-runtime-extractor-ctpwk"
Apr 16 16:50:25.619851 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:25.619774 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m6qwd\" (UniqueName: \"kubernetes.io/projected/8123167b-2df8-460e-ab9c-5bdd6b4a099f-kube-api-access-m6qwd\") pod \"insights-runtime-extractor-ctpwk\" (UID: \"8123167b-2df8-460e-ab9c-5bdd6b4a099f\") " pod="openshift-insights/insights-runtime-extractor-ctpwk"
Apr 16 16:50:25.619851 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:25.619826 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/8123167b-2df8-460e-ab9c-5bdd6b4a099f-data-volume\") pod \"insights-runtime-extractor-ctpwk\" (UID: \"8123167b-2df8-460e-ab9c-5bdd6b4a099f\") " pod="openshift-insights/insights-runtime-extractor-ctpwk"
Apr 16 16:50:25.621049 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:25.621025 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/8123167b-2df8-460e-ab9c-5bdd6b4a099f-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-ctpwk\" (UID: \"8123167b-2df8-460e-ab9c-5bdd6b4a099f\") " pod="openshift-insights/insights-runtime-extractor-ctpwk"
Apr 16 16:50:25.623499 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:25.623468 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/8123167b-2df8-460e-ab9c-5bdd6b4a099f-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-ctpwk\" (UID: \"8123167b-2df8-460e-ab9c-5bdd6b4a099f\") " pod="openshift-insights/insights-runtime-extractor-ctpwk"
Apr 16 16:50:25.631157 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:25.631135 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6qwd\" (UniqueName: \"kubernetes.io/projected/8123167b-2df8-460e-ab9c-5bdd6b4a099f-kube-api-access-m6qwd\") pod \"insights-runtime-extractor-ctpwk\" (UID: \"8123167b-2df8-460e-ab9c-5bdd6b4a099f\") " pod="openshift-insights/insights-runtime-extractor-ctpwk"
Apr 16 16:50:25.720883 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:25.720844 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qhtvc\" (UniqueName: \"kubernetes.io/projected/9d1ef785-865a-4cf2-b57e-d3b55eea5a1c-kube-api-access-qhtvc\") pod \"image-registry-7655577c6d-7np48\" (UID: \"9d1ef785-865a-4cf2-b57e-d3b55eea5a1c\") " pod="openshift-image-registry/image-registry-7655577c6d-7np48"
Apr 16 16:50:25.721062 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:25.720919 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9d1ef785-865a-4cf2-b57e-d3b55eea5a1c-bound-sa-token\") pod \"image-registry-7655577c6d-7np48\" (UID: \"9d1ef785-865a-4cf2-b57e-d3b55eea5a1c\") " pod="openshift-image-registry/image-registry-7655577c6d-7np48"
Apr 16 16:50:25.721062 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:25.720948 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9d1ef785-865a-4cf2-b57e-d3b55eea5a1c-registry-certificates\") pod \"image-registry-7655577c6d-7np48\" (UID: \"9d1ef785-865a-4cf2-b57e-d3b55eea5a1c\") " pod="openshift-image-registry/image-registry-7655577c6d-7np48"
Apr 16 16:50:25.721062 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:25.720993 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9d1ef785-865a-4cf2-b57e-d3b55eea5a1c-registry-tls\") pod \"image-registry-7655577c6d-7np48\" (UID: \"9d1ef785-865a-4cf2-b57e-d3b55eea5a1c\") " pod="openshift-image-registry/image-registry-7655577c6d-7np48"
Apr 16 16:50:25.721062 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:25.721015 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9d1ef785-865a-4cf2-b57e-d3b55eea5a1c-ca-trust-extracted\") pod \"image-registry-7655577c6d-7np48\" (UID: \"9d1ef785-865a-4cf2-b57e-d3b55eea5a1c\") " pod="openshift-image-registry/image-registry-7655577c6d-7np48"
Apr 16 16:50:25.721062 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:25.721035 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d1ef785-865a-4cf2-b57e-d3b55eea5a1c-trusted-ca\") pod \"image-registry-7655577c6d-7np48\" (UID: \"9d1ef785-865a-4cf2-b57e-d3b55eea5a1c\") " pod="openshift-image-registry/image-registry-7655577c6d-7np48"
Apr 16 16:50:25.721062 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:25.721059 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/9d1ef785-865a-4cf2-b57e-d3b55eea5a1c-image-registry-private-configuration\") pod \"image-registry-7655577c6d-7np48\" (UID: \"9d1ef785-865a-4cf2-b57e-d3b55eea5a1c\") " pod="openshift-image-registry/image-registry-7655577c6d-7np48"
Apr 16 16:50:25.721346 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:25.721087 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9d1ef785-865a-4cf2-b57e-d3b55eea5a1c-installation-pull-secrets\") pod \"image-registry-7655577c6d-7np48\" (UID: \"9d1ef785-865a-4cf2-b57e-d3b55eea5a1c\") " pod="openshift-image-registry/image-registry-7655577c6d-7np48"
Apr 16 16:50:25.722449 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:25.721798 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9d1ef785-865a-4cf2-b57e-d3b55eea5a1c-ca-trust-extracted\") pod \"image-registry-7655577c6d-7np48\" (UID: \"9d1ef785-865a-4cf2-b57e-d3b55eea5a1c\") " pod="openshift-image-registry/image-registry-7655577c6d-7np48"
Apr 16 16:50:25.722449 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:25.722208 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9d1ef785-865a-4cf2-b57e-d3b55eea5a1c-registry-certificates\") pod \"image-registry-7655577c6d-7np48\" (UID: \"9d1ef785-865a-4cf2-b57e-d3b55eea5a1c\") " pod="openshift-image-registry/image-registry-7655577c6d-7np48"
Apr 16 16:50:25.722449 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:25.722397 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d1ef785-865a-4cf2-b57e-d3b55eea5a1c-trusted-ca\") pod \"image-registry-7655577c6d-7np48\" (UID: \"9d1ef785-865a-4cf2-b57e-d3b55eea5a1c\") " pod="openshift-image-registry/image-registry-7655577c6d-7np48"
Apr 16 16:50:25.724065 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:25.724044 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9d1ef785-865a-4cf2-b57e-d3b55eea5a1c-registry-tls\") pod \"image-registry-7655577c6d-7np48\" (UID: \"9d1ef785-865a-4cf2-b57e-d3b55eea5a1c\") " pod="openshift-image-registry/image-registry-7655577c6d-7np48"
Apr 16 16:50:25.724167 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:25.724042 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/9d1ef785-865a-4cf2-b57e-d3b55eea5a1c-image-registry-private-configuration\") pod \"image-registry-7655577c6d-7np48\" (UID: \"9d1ef785-865a-4cf2-b57e-d3b55eea5a1c\") " pod="openshift-image-registry/image-registry-7655577c6d-7np48"
Apr 16 16:50:25.724311 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:25.724288 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9d1ef785-865a-4cf2-b57e-d3b55eea5a1c-installation-pull-secrets\") pod \"image-registry-7655577c6d-7np48\" (UID: \"9d1ef785-865a-4cf2-b57e-d3b55eea5a1c\") " pod="openshift-image-registry/image-registry-7655577c6d-7np48"
Apr 16 16:50:25.729413 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:25.729385 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9d1ef785-865a-4cf2-b57e-d3b55eea5a1c-bound-sa-token\") pod \"image-registry-7655577c6d-7np48\" (UID: \"9d1ef785-865a-4cf2-b57e-d3b55eea5a1c\") " pod="openshift-image-registry/image-registry-7655577c6d-7np48"
Apr 16 16:50:25.729510 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:25.729471 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhtvc\" (UniqueName: \"kubernetes.io/projected/9d1ef785-865a-4cf2-b57e-d3b55eea5a1c-kube-api-access-qhtvc\") pod \"image-registry-7655577c6d-7np48\" (UID: \"9d1ef785-865a-4cf2-b57e-d3b55eea5a1c\") " pod="openshift-image-registry/image-registry-7655577c6d-7np48"
Apr 16 16:50:25.760114 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:25.760079 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-ctpwk"
Apr 16 16:50:25.822348 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:25.822029 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7655577c6d-7np48"
Apr 16 16:50:25.887081 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:25.886698 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-ctpwk"]
Apr 16 16:50:26.162328 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:50:26.162236 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8123167b_2df8_460e_ab9c_5bdd6b4a099f.slice/crio-7f20509b4b8a01cbc572cc3e2b5515fa13ccb48a996f979719c55cfeaca4bc74 WatchSource:0}: Error finding container 7f20509b4b8a01cbc572cc3e2b5515fa13ccb48a996f979719c55cfeaca4bc74: Status 404 returned error can't find the container with id 7f20509b4b8a01cbc572cc3e2b5515fa13ccb48a996f979719c55cfeaca4bc74
Apr 16 16:50:26.289664 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:26.289606 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-7655577c6d-7np48"]
Apr 16 16:50:26.293401 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:50:26.293367 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d1ef785_865a_4cf2_b57e_d3b55eea5a1c.slice/crio-27e4d4afb183aa0c79e463b4fb23f8307228fd0b4795859ab341380730f660cd WatchSource:0}: Error finding container 27e4d4afb183aa0c79e463b4fb23f8307228fd0b4795859ab341380730f660cd: Status 404 returned error can't find the container with id 27e4d4afb183aa0c79e463b4fb23f8307228fd0b4795859ab341380730f660cd
Apr 16 16:50:26.360076 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:26.359629 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-ctpwk" event={"ID":"8123167b-2df8-460e-ab9c-5bdd6b4a099f","Type":"ContainerStarted","Data":"8c8abfb1b4807ba0ed9c4b53980d60a553e952fd6fdc26357db4e1047a679799"}
Apr 16 16:50:26.360076 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:26.359677 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-ctpwk" event={"ID":"8123167b-2df8-460e-ab9c-5bdd6b4a099f","Type":"ContainerStarted","Data":"7f20509b4b8a01cbc572cc3e2b5515fa13ccb48a996f979719c55cfeaca4bc74"}
Apr 16 16:50:26.361590 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:26.361560 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-ndbsz" event={"ID":"e9c730fc-1a8b-4c50-92f3-2ebfd693c270","Type":"ContainerStarted","Data":"75bf0c49b60f75d866ef0e76f7d6ef92854da3bb65d8ed19a33daeed263e2f5f"}
Apr 16 16:50:26.363158 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:26.363132 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7655577c6d-7np48" event={"ID":"9d1ef785-865a-4cf2-b57e-d3b55eea5a1c","Type":"ContainerStarted","Data":"08fc301124152bfc74a0acdca7c83e1de57ba9159ea69edd991f71fce090d6a4"}
Apr 16 16:50:26.363251 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:26.363163 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7655577c6d-7np48" event={"ID":"9d1ef785-865a-4cf2-b57e-d3b55eea5a1c","Type":"ContainerStarted","Data":"27e4d4afb183aa0c79e463b4fb23f8307228fd0b4795859ab341380730f660cd"}
Apr 16 16:50:26.383943 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:26.383887 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-7655577c6d-7np48" podStartSLOduration=1.38386865 podStartE2EDuration="1.38386865s" podCreationTimestamp="2026-04-16 16:50:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:50:26.383503143 +0000 UTC m=+163.053679923" watchObservedRunningTime="2026-04-16 16:50:26.38386865 +0000 UTC m=+163.054045429"
Apr 16 16:50:27.368067 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:27.368026 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-ctpwk" event={"ID":"8123167b-2df8-460e-ab9c-5bdd6b4a099f","Type":"ContainerStarted","Data":"0d3a56ac64ea2e02cee7ae5f2ccbf5c3dbf7e42bb4dbdff5772a5b2d03a073c9"}
Apr 16 16:50:27.369713 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:27.369681 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-ndbsz" event={"ID":"e9c730fc-1a8b-4c50-92f3-2ebfd693c270","Type":"ContainerStarted","Data":"3b6303646b91d2e0e764431f902c735e82392522dd63a3507faffc6cd3303038"}
Apr 16 16:50:27.370270 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:27.370217 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-7655577c6d-7np48"
Apr 16 16:50:27.370270 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:27.370250 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-ndbsz"
Apr 16 16:50:27.388894 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:27.388692 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-ndbsz" podStartSLOduration=130.142463061 podStartE2EDuration="2m11.388675271s" podCreationTimestamp="2026-04-16 16:48:16 +0000 UTC" firstStartedPulling="2026-04-16 16:50:24.966514095 +0000 UTC m=+161.636690864" lastFinishedPulling="2026-04-16 16:50:26.212726301 +0000 UTC m=+162.882903074" observedRunningTime="2026-04-16 16:50:27.388288951 +0000 UTC m=+164.058465730" watchObservedRunningTime="2026-04-16 16:50:27.388675271 +0000 UTC m=+164.058852053"
Apr 16 16:50:28.374417 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:28.374326 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-ctpwk" event={"ID":"8123167b-2df8-460e-ab9c-5bdd6b4a099f","Type":"ContainerStarted","Data":"6a0bc0ce7dae5cba00267034175d616b3748506beaf6bf08449c0240024baa0f"}
Apr 16 16:50:28.391368 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:28.391297 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-ctpwk" podStartSLOduration=1.5129845610000001 podStartE2EDuration="3.391282102s" podCreationTimestamp="2026-04-16 16:50:25 +0000 UTC" firstStartedPulling="2026-04-16 16:50:26.221510629 +0000 UTC m=+162.891687385" lastFinishedPulling="2026-04-16 16:50:28.099808164 +0000 UTC m=+164.769984926" observedRunningTime="2026-04-16 16:50:28.39058699 +0000 UTC m=+165.060763768" watchObservedRunningTime="2026-04-16 16:50:28.391282102 +0000 UTC m=+165.061458882"
Apr 16 16:50:29.857614 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:29.857558 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/073fb8a8-affb-434d-9a78-e8ef0444fc11-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-xwh89\" (UID: \"073fb8a8-affb-434d-9a78-e8ef0444fc11\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-xwh89"
Apr 16 16:50:29.860005 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:29.859978 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/073fb8a8-affb-434d-9a78-e8ef0444fc11-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-xwh89\" (UID: \"073fb8a8-affb-434d-9a78-e8ef0444fc11\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-xwh89"
Apr 16 16:50:30.097096 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:30.097059 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-xwh89"
Apr 16 16:50:30.211777 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:30.211729 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6667474d89-xwh89"]
Apr 16 16:50:30.216102 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:50:30.216078 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod073fb8a8_affb_434d_9a78_e8ef0444fc11.slice/crio-a65fd8c66568e73bc249a0bc968045c1ad271c8075d89938fa838e89d8d188b4 WatchSource:0}: Error finding container a65fd8c66568e73bc249a0bc968045c1ad271c8075d89938fa838e89d8d188b4: Status 404 returned error can't find the container with id a65fd8c66568e73bc249a0bc968045c1ad271c8075d89938fa838e89d8d188b4
Apr 16 16:50:30.380160 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:30.380123 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-xwh89" event={"ID":"073fb8a8-affb-434d-9a78-e8ef0444fc11","Type":"ContainerStarted","Data":"a65fd8c66568e73bc249a0bc968045c1ad271c8075d89938fa838e89d8d188b4"}
Apr 16 16:50:32.387087 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:32.387050 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-xwh89" event={"ID":"073fb8a8-affb-434d-9a78-e8ef0444fc11","Type":"ContainerStarted","Data":"68981d37748b20dfbaf7f46e56f9a27b9bf02180ea3b022b2cb740530361595f"}
Apr 16 16:50:32.404714 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:32.404663 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-xwh89" podStartSLOduration=33.981111475 podStartE2EDuration="35.404650761s" podCreationTimestamp="2026-04-16 16:49:57 +0000 UTC" firstStartedPulling="2026-04-16 16:50:30.218406651 +0000 UTC m=+166.888583408" lastFinishedPulling="2026-04-16 16:50:31.64194593 +0000 UTC m=+168.312122694" observedRunningTime="2026-04-16 16:50:32.404261096 +0000 UTC m=+169.074437875" watchObservedRunningTime="2026-04-16 16:50:32.404650761 +0000 UTC m=+169.074827539"
Apr 16 16:50:33.931147 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:33.931111 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5rwjz"
Apr 16 16:50:33.931558 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:33.931207 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-4nrrn"
Apr 16 16:50:33.934370 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:33.934350 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-s9vpx\""
Apr 16 16:50:33.941916 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:33.941896 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-4nrrn"
Apr 16 16:50:34.060804 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:34.060767 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-4nrrn"]
Apr 16 16:50:34.065212 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:50:34.065185 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaf4910d1_f39b_44e2_805e_bcc17c0e30d0.slice/crio-e76faaf1b52066ab9389ebe8323dcb632c1a9e9320d98468a6fc5fd0b3384375 WatchSource:0}: Error finding container e76faaf1b52066ab9389ebe8323dcb632c1a9e9320d98468a6fc5fd0b3384375: Status 404 returned error can't find the container with id e76faaf1b52066ab9389ebe8323dcb632c1a9e9320d98468a6fc5fd0b3384375
Apr 16 16:50:34.392824 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:34.392793 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-4nrrn" event={"ID":"af4910d1-f39b-44e2-805e-bcc17c0e30d0","Type":"ContainerStarted","Data":"e76faaf1b52066ab9389ebe8323dcb632c1a9e9320d98468a6fc5fd0b3384375"}
Apr 16 16:50:36.398574 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:36.398533 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-4nrrn" event={"ID":"af4910d1-f39b-44e2-805e-bcc17c0e30d0","Type":"ContainerStarted","Data":"1ff5b8efc8a7b7a055c33ddafc69f5cb19041af491724581cf606bbc4c8ec529"}
Apr 16 16:50:36.413865 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:36.413818 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-4nrrn" podStartSLOduration=138.960976326 podStartE2EDuration="2m20.41380424s" podCreationTimestamp="2026-04-16 16:48:16 +0000 UTC" firstStartedPulling="2026-04-16 16:50:34.067094706 +0000 UTC m=+170.737271475" lastFinishedPulling="2026-04-16 16:50:35.519922619 +0000 UTC m=+172.190099389" observedRunningTime="2026-04-16 16:50:36.413257722 +0000 UTC m=+173.083434501" watchObservedRunningTime="2026-04-16 16:50:36.41380424 +0000 UTC m=+173.083981011"
Apr 16 16:50:37.651429 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:37.651397 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-9g78g"]
Apr 16 16:50:37.656633 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:37.656608 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-9g78g"
Apr 16 16:50:37.659708 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:37.659686 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 16 16:50:37.659877 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:37.659825 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 16 16:50:37.660184 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:37.660159 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 16 16:50:37.661309 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:37.661291 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-hp9nk\""
Apr 16 16:50:37.661413 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:37.661290 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 16 16:50:37.722806 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:37.722764 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2m7z\" (UniqueName: \"kubernetes.io/projected/4e1fd5dc-ed1e-40be-a46a-6c2e9a2ef55e-kube-api-access-h2m7z\") pod \"node-exporter-9g78g\" (UID: \"4e1fd5dc-ed1e-40be-a46a-6c2e9a2ef55e\") " pod="openshift-monitoring/node-exporter-9g78g"
Apr 16 16:50:37.722962 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:37.722865 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/4e1fd5dc-ed1e-40be-a46a-6c2e9a2ef55e-root\") pod \"node-exporter-9g78g\" (UID: \"4e1fd5dc-ed1e-40be-a46a-6c2e9a2ef55e\") " pod="openshift-monitoring/node-exporter-9g78g"
Apr 16 16:50:37.722962 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:37.722899 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/4e1fd5dc-ed1e-40be-a46a-6c2e9a2ef55e-node-exporter-textfile\") pod \"node-exporter-9g78g\" (UID: \"4e1fd5dc-ed1e-40be-a46a-6c2e9a2ef55e\") " pod="openshift-monitoring/node-exporter-9g78g"
Apr 16 16:50:37.722962 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:37.722931 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4e1fd5dc-ed1e-40be-a46a-6c2e9a2ef55e-sys\") pod \"node-exporter-9g78g\" (UID: \"4e1fd5dc-ed1e-40be-a46a-6c2e9a2ef55e\") " pod="openshift-monitoring/node-exporter-9g78g"
Apr 16 16:50:37.723115 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:37.722964 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/4e1fd5dc-ed1e-40be-a46a-6c2e9a2ef55e-node-exporter-accelerators-collector-config\") pod \"node-exporter-9g78g\" (UID: \"4e1fd5dc-ed1e-40be-a46a-6c2e9a2ef55e\") " pod="openshift-monitoring/node-exporter-9g78g"
Apr 16 16:50:37.723115 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:37.723012 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/4e1fd5dc-ed1e-40be-a46a-6c2e9a2ef55e-node-exporter-tls\") pod \"node-exporter-9g78g\" (UID: \"4e1fd5dc-ed1e-40be-a46a-6c2e9a2ef55e\") " pod="openshift-monitoring/node-exporter-9g78g"
Apr 16 16:50:37.723115 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:37.723036 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/4e1fd5dc-ed1e-40be-a46a-6c2e9a2ef55e-node-exporter-wtmp\") pod \"node-exporter-9g78g\" (UID: \"4e1fd5dc-ed1e-40be-a46a-6c2e9a2ef55e\") " pod="openshift-monitoring/node-exporter-9g78g"
Apr 16 16:50:37.723115 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:37.723061 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/4e1fd5dc-ed1e-40be-a46a-6c2e9a2ef55e-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-9g78g\" (UID: \"4e1fd5dc-ed1e-40be-a46a-6c2e9a2ef55e\") " pod="openshift-monitoring/node-exporter-9g78g"
Apr 16 16:50:37.723115 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:37.723110 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4e1fd5dc-ed1e-40be-a46a-6c2e9a2ef55e-metrics-client-ca\") pod \"node-exporter-9g78g\" (UID: \"4e1fd5dc-ed1e-40be-a46a-6c2e9a2ef55e\") " pod="openshift-monitoring/node-exporter-9g78g"
Apr 16 16:50:37.824395 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:37.824361 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/4e1fd5dc-ed1e-40be-a46a-6c2e9a2ef55e-root\") pod \"node-exporter-9g78g\" (UID: \"4e1fd5dc-ed1e-40be-a46a-6c2e9a2ef55e\") " pod="openshift-monitoring/node-exporter-9g78g"
Apr 16 16:50:37.824395 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:37.824397 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/4e1fd5dc-ed1e-40be-a46a-6c2e9a2ef55e-node-exporter-textfile\") pod \"node-exporter-9g78g\" (UID: \"4e1fd5dc-ed1e-40be-a46a-6c2e9a2ef55e\") " pod="openshift-monitoring/node-exporter-9g78g"
Apr 16 16:50:37.824678 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:37.824426 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4e1fd5dc-ed1e-40be-a46a-6c2e9a2ef55e-sys\") pod \"node-exporter-9g78g\" (UID: \"4e1fd5dc-ed1e-40be-a46a-6c2e9a2ef55e\") " pod="openshift-monitoring/node-exporter-9g78g"
Apr 16 16:50:37.824678 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:37.824477 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4e1fd5dc-ed1e-40be-a46a-6c2e9a2ef55e-sys\") pod \"node-exporter-9g78g\" (UID: \"4e1fd5dc-ed1e-40be-a46a-6c2e9a2ef55e\") " pod="openshift-monitoring/node-exporter-9g78g"
Apr 16 16:50:37.824678 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:37.824491 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/4e1fd5dc-ed1e-40be-a46a-6c2e9a2ef55e-root\") pod \"node-exporter-9g78g\" (UID: \"4e1fd5dc-ed1e-40be-a46a-6c2e9a2ef55e\") " pod="openshift-monitoring/node-exporter-9g78g"
Apr 16 16:50:37.824678 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:37.824505 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/4e1fd5dc-ed1e-40be-a46a-6c2e9a2ef55e-node-exporter-accelerators-collector-config\") pod \"node-exporter-9g78g\" (UID: \"4e1fd5dc-ed1e-40be-a46a-6c2e9a2ef55e\") " pod="openshift-monitoring/node-exporter-9g78g"
Apr 16 16:50:37.824678 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:37.824588 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/4e1fd5dc-ed1e-40be-a46a-6c2e9a2ef55e-node-exporter-tls\") pod \"node-exporter-9g78g\" (UID: \"4e1fd5dc-ed1e-40be-a46a-6c2e9a2ef55e\") " pod="openshift-monitoring/node-exporter-9g78g"
Apr 16 16:50:37.824678 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:37.824624 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/4e1fd5dc-ed1e-40be-a46a-6c2e9a2ef55e-node-exporter-wtmp\") pod \"node-exporter-9g78g\" (UID: \"4e1fd5dc-ed1e-40be-a46a-6c2e9a2ef55e\") " pod="openshift-monitoring/node-exporter-9g78g"
Apr 16 16:50:37.824678 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:37.824653 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/4e1fd5dc-ed1e-40be-a46a-6c2e9a2ef55e-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-9g78g\" (UID: \"4e1fd5dc-ed1e-40be-a46a-6c2e9a2ef55e\") " pod="openshift-monitoring/node-exporter-9g78g"
Apr 16 16:50:37.825022 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:37.824716 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4e1fd5dc-ed1e-40be-a46a-6c2e9a2ef55e-metrics-client-ca\") pod \"node-exporter-9g78g\" (UID: \"4e1fd5dc-ed1e-40be-a46a-6c2e9a2ef55e\") " pod="openshift-monitoring/node-exporter-9g78g"
Apr 16 16:50:37.825022 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:37.824744 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h2m7z\" (UniqueName: \"kubernetes.io/projected/4e1fd5dc-ed1e-40be-a46a-6c2e9a2ef55e-kube-api-access-h2m7z\") pod \"node-exporter-9g78g\" (UID: \"4e1fd5dc-ed1e-40be-a46a-6c2e9a2ef55e\") " pod="openshift-monitoring/node-exporter-9g78g"
Apr 16 16:50:37.825022 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:37.824761 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/4e1fd5dc-ed1e-40be-a46a-6c2e9a2ef55e-node-exporter-textfile\") pod \"node-exporter-9g78g\" (UID: \"4e1fd5dc-ed1e-40be-a46a-6c2e9a2ef55e\") " pod="openshift-monitoring/node-exporter-9g78g"
Apr 16 16:50:37.825022 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:37.824782 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/4e1fd5dc-ed1e-40be-a46a-6c2e9a2ef55e-node-exporter-wtmp\") pod \"node-exporter-9g78g\" (UID: \"4e1fd5dc-ed1e-40be-a46a-6c2e9a2ef55e\") " pod="openshift-monitoring/node-exporter-9g78g"
Apr 16 16:50:37.825212 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:37.825114 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/4e1fd5dc-ed1e-40be-a46a-6c2e9a2ef55e-node-exporter-accelerators-collector-config\") pod \"node-exporter-9g78g\" (UID: \"4e1fd5dc-ed1e-40be-a46a-6c2e9a2ef55e\") " pod="openshift-monitoring/node-exporter-9g78g"
Apr 16 16:50:37.825212 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:37.825168 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4e1fd5dc-ed1e-40be-a46a-6c2e9a2ef55e-metrics-client-ca\") pod \"node-exporter-9g78g\" (UID: \"4e1fd5dc-ed1e-40be-a46a-6c2e9a2ef55e\") " pod="openshift-monitoring/node-exporter-9g78g"
Apr 16 16:50:37.827173 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:37.827148 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/4e1fd5dc-ed1e-40be-a46a-6c2e9a2ef55e-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-9g78g\" (UID: \"4e1fd5dc-ed1e-40be-a46a-6c2e9a2ef55e\") " pod="openshift-monitoring/node-exporter-9g78g"
Apr 16 16:50:37.827306 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:37.827226 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/4e1fd5dc-ed1e-40be-a46a-6c2e9a2ef55e-node-exporter-tls\") pod \"node-exporter-9g78g\" (UID: \"4e1fd5dc-ed1e-40be-a46a-6c2e9a2ef55e\") " pod="openshift-monitoring/node-exporter-9g78g"
Apr 16 16:50:37.833802 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:37.833779 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2m7z\" (UniqueName: \"kubernetes.io/projected/4e1fd5dc-ed1e-40be-a46a-6c2e9a2ef55e-kube-api-access-h2m7z\") pod \"node-exporter-9g78g\" (UID: \"4e1fd5dc-ed1e-40be-a46a-6c2e9a2ef55e\") " pod="openshift-monitoring/node-exporter-9g78g"
Apr 16 16:50:37.968625 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:37.968540 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-9g78g"
Apr 16 16:50:37.976329 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:50:37.976294 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4e1fd5dc_ed1e_40be_a46a_6c2e9a2ef55e.slice/crio-71355dabba7aab718015e5dd6a50c67fb1c5cbf24dfb85c64c89136d400581dc WatchSource:0}: Error finding container 71355dabba7aab718015e5dd6a50c67fb1c5cbf24dfb85c64c89136d400581dc: Status 404 returned error can't find the container with id 71355dabba7aab718015e5dd6a50c67fb1c5cbf24dfb85c64c89136d400581dc
Apr 16 16:50:38.378867 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:38.378825 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-ndbsz"
Apr 16 16:50:38.405060 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:38.405029 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-9g78g" event={"ID":"4e1fd5dc-ed1e-40be-a46a-6c2e9a2ef55e","Type":"ContainerStarted","Data":"71355dabba7aab718015e5dd6a50c67fb1c5cbf24dfb85c64c89136d400581dc"}
Apr 16 16:50:39.408667 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:39.408637 2568 generic.go:358] "Generic (PLEG): container finished" podID="4e1fd5dc-ed1e-40be-a46a-6c2e9a2ef55e" containerID="5566c43b3726219b2e06c7ea7950087e7ea5998b28c39692d046f46ccc5da906" exitCode=0
Apr 16 16:50:39.409033 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:39.408697 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-9g78g" event={"ID":"4e1fd5dc-ed1e-40be-a46a-6c2e9a2ef55e","Type":"ContainerDied","Data":"5566c43b3726219b2e06c7ea7950087e7ea5998b28c39692d046f46ccc5da906"}
Apr 16 16:50:40.412820 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:40.412780 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-9g78g" event={"ID":"4e1fd5dc-ed1e-40be-a46a-6c2e9a2ef55e","Type":"ContainerStarted","Data":"b818f942567471d34f9cff9406f4e59f35ee41859195a7c1b827e119be28eb54"}
Apr 16 16:50:40.412820 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:40.412818 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-9g78g" event={"ID":"4e1fd5dc-ed1e-40be-a46a-6c2e9a2ef55e","Type":"ContainerStarted","Data":"f11c8becbbcafda57ad335d29b9d9c77e27a0d6eb682ee61797de5dd251a42db"}
Apr 16 16:50:40.440017 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:40.439966 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-9g78g" podStartSLOduration=2.751523648 podStartE2EDuration="3.439951345s" podCreationTimestamp="2026-04-16 16:50:37 +0000 UTC" firstStartedPulling="2026-04-16 16:50:37.978085703 +0000 UTC m=+174.648262464" lastFinishedPulling="2026-04-16 16:50:38.666513404 +0000 UTC m=+175.336690161" observedRunningTime="2026-04-16 16:50:40.438034989 +0000 UTC m=+177.108211805" watchObservedRunningTime="2026-04-16 16:50:40.439951345 +0000 UTC m=+177.110128123"
Apr 16 16:50:42.322906 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:42.322869 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-5876b4bbc7-nxqlt"]
Apr 16 16:50:42.325946 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:42.325930 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-nxqlt"
Apr 16 16:50:42.328584 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:42.328565 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\""
Apr 16 16:50:42.328653 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:42.328596 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-6pttv\""
Apr 16 16:50:42.335259 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:42.335234 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-5876b4bbc7-nxqlt"]
Apr 16 16:50:42.460677 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:42.460642 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/a8fa4d4b-1853-466d-a89a-f197d586e400-monitoring-plugin-cert\") pod \"monitoring-plugin-5876b4bbc7-nxqlt\" (UID: \"a8fa4d4b-1853-466d-a89a-f197d586e400\") " pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-nxqlt"
Apr 16 16:50:42.562013 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:42.561974 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/a8fa4d4b-1853-466d-a89a-f197d586e400-monitoring-plugin-cert\") pod \"monitoring-plugin-5876b4bbc7-nxqlt\" (UID: \"a8fa4d4b-1853-466d-a89a-f197d586e400\") " pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-nxqlt"
Apr 16 16:50:42.562180 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:50:42.562131 2568 secret.go:189] Couldn't get secret openshift-monitoring/monitoring-plugin-cert: secret "monitoring-plugin-cert" not found
Apr 16 16:50:42.562220 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:50:42.562198 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a8fa4d4b-1853-466d-a89a-f197d586e400-monitoring-plugin-cert podName:a8fa4d4b-1853-466d-a89a-f197d586e400 nodeName:}" failed. No retries permitted until 2026-04-16 16:50:43.062178879 +0000 UTC m=+179.732355636 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "monitoring-plugin-cert" (UniqueName: "kubernetes.io/secret/a8fa4d4b-1853-466d-a89a-f197d586e400-monitoring-plugin-cert") pod "monitoring-plugin-5876b4bbc7-nxqlt" (UID: "a8fa4d4b-1853-466d-a89a-f197d586e400") : secret "monitoring-plugin-cert" not found
Apr 16 16:50:43.066945 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:43.066906 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/a8fa4d4b-1853-466d-a89a-f197d586e400-monitoring-plugin-cert\") pod \"monitoring-plugin-5876b4bbc7-nxqlt\" (UID: \"a8fa4d4b-1853-466d-a89a-f197d586e400\") " pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-nxqlt"
Apr 16 16:50:43.069385 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:43.069351 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/a8fa4d4b-1853-466d-a89a-f197d586e400-monitoring-plugin-cert\") pod \"monitoring-plugin-5876b4bbc7-nxqlt\" (UID: \"a8fa4d4b-1853-466d-a89a-f197d586e400\") " pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-nxqlt"
Apr 16 16:50:43.235533 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:43.235488 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-nxqlt"
Apr 16 16:50:43.347731 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:43.347654 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-5876b4bbc7-nxqlt"]
Apr 16 16:50:43.351107 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:50:43.351079 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda8fa4d4b_1853_466d_a89a_f197d586e400.slice/crio-c7e2b6ddad45b3259e7cb63850011fd097fa0c09d3a91ebc6538beae95d1332f WatchSource:0}: Error finding container c7e2b6ddad45b3259e7cb63850011fd097fa0c09d3a91ebc6538beae95d1332f: Status 404 returned error can't find the container with id c7e2b6ddad45b3259e7cb63850011fd097fa0c09d3a91ebc6538beae95d1332f
Apr 16 16:50:43.422580 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:43.422541 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-nxqlt" event={"ID":"a8fa4d4b-1853-466d-a89a-f197d586e400","Type":"ContainerStarted","Data":"c7e2b6ddad45b3259e7cb63850011fd097fa0c09d3a91ebc6538beae95d1332f"}
Apr 16 16:50:43.773236 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:43.773195 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 16 16:50:43.778147 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:43.778119 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:50:43.781083 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:43.781061 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\""
Apr 16 16:50:43.781083 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:43.781081 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-3eh9lmjtm83rd\""
Apr 16 16:50:43.781297 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:43.781283 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\""
Apr 16 16:50:43.781814 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:43.781798 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\""
Apr 16 16:50:43.781814 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:43.781806 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\""
Apr 16 16:50:43.781960 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:43.781816 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\""
Apr 16 16:50:43.782079 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:43.782061 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\""
Apr 16 16:50:43.782432 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:43.782416 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\""
Apr 16 16:50:43.782484 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:43.782432 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\""
Apr 16 16:50:43.783961 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:43.783876 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-vd5ts\""
Apr 16 16:50:43.783961 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:43.783934 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\""
Apr 16 16:50:43.783961 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:43.783945 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\""
Apr 16 16:50:43.784188 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:43.783958 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\""
Apr 16 16:50:43.784260 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:43.784189 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\""
Apr 16 16:50:43.786031 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:43.786010 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\""
Apr 16 16:50:43.798818 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:43.798791 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 16 16:50:43.873136 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:43.873100 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/12495802-0b96-4046-a489-89800a54f412-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"12495802-0b96-4046-a489-89800a54f412\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:50:43.873319 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:43.873176 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/12495802-0b96-4046-a489-89800a54f412-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"12495802-0b96-4046-a489-89800a54f412\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:50:43.873319 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:43.873213 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/12495802-0b96-4046-a489-89800a54f412-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"12495802-0b96-4046-a489-89800a54f412\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:50:43.873319 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:43.873242 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/12495802-0b96-4046-a489-89800a54f412-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"12495802-0b96-4046-a489-89800a54f412\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:50:43.873319 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:43.873274 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/12495802-0b96-4046-a489-89800a54f412-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"12495802-0b96-4046-a489-89800a54f412\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:50:43.873319 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:43.873303 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/12495802-0b96-4046-a489-89800a54f412-config-out\") pod \"prometheus-k8s-0\" (UID: \"12495802-0b96-4046-a489-89800a54f412\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:50:43.873590 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:43.873344 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sn8wj\" (UniqueName: \"kubernetes.io/projected/12495802-0b96-4046-a489-89800a54f412-kube-api-access-sn8wj\") pod \"prometheus-k8s-0\" (UID: \"12495802-0b96-4046-a489-89800a54f412\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:50:43.873590 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:43.873373 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/12495802-0b96-4046-a489-89800a54f412-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"12495802-0b96-4046-a489-89800a54f412\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:50:43.873590 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:43.873406 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/12495802-0b96-4046-a489-89800a54f412-config\") pod \"prometheus-k8s-0\" (UID: \"12495802-0b96-4046-a489-89800a54f412\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:50:43.873590 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:43.873430 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/12495802-0b96-4046-a489-89800a54f412-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"12495802-0b96-4046-a489-89800a54f412\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:50:43.873590 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:43.873462 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/12495802-0b96-4046-a489-89800a54f412-web-config\") pod \"prometheus-k8s-0\" (UID: \"12495802-0b96-4046-a489-89800a54f412\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:50:43.873590 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:43.873487 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/12495802-0b96-4046-a489-89800a54f412-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"12495802-0b96-4046-a489-89800a54f412\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:50:43.873590 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:43.873505 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/12495802-0b96-4046-a489-89800a54f412-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"12495802-0b96-4046-a489-89800a54f412\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:50:43.873590 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:43.873563 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/12495802-0b96-4046-a489-89800a54f412-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"12495802-0b96-4046-a489-89800a54f412\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:50:43.873590 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:43.873593 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/12495802-0b96-4046-a489-89800a54f412-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"12495802-0b96-4046-a489-89800a54f412\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:50:43.873891 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:43.873616 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/12495802-0b96-4046-a489-89800a54f412-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"12495802-0b96-4046-a489-89800a54f412\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:50:43.873891 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:43.873647 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/12495802-0b96-4046-a489-89800a54f412-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"12495802-0b96-4046-a489-89800a54f412\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:50:43.873891 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:43.873669 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/12495802-0b96-4046-a489-89800a54f412-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"12495802-0b96-4046-a489-89800a54f412\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:50:43.974609 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:43.974571 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/12495802-0b96-4046-a489-89800a54f412-config\") pod \"prometheus-k8s-0\" (UID: \"12495802-0b96-4046-a489-89800a54f412\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:50:43.974799 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:43.974623 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/12495802-0b96-4046-a489-89800a54f412-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"12495802-0b96-4046-a489-89800a54f412\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:50:43.974799 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:43.974662 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/12495802-0b96-4046-a489-89800a54f412-web-config\") pod \"prometheus-k8s-0\" (UID: \"12495802-0b96-4046-a489-89800a54f412\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:50:43.974799 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:43.974691 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/12495802-0b96-4046-a489-89800a54f412-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"12495802-0b96-4046-a489-89800a54f412\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:50:43.974799 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:43.974717 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/12495802-0b96-4046-a489-89800a54f412-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"12495802-0b96-4046-a489-89800a54f412\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:50:43.974799 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:43.974747 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/12495802-0b96-4046-a489-89800a54f412-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"12495802-0b96-4046-a489-89800a54f412\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:50:43.974799 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:43.974772 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/12495802-0b96-4046-a489-89800a54f412-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"12495802-0b96-4046-a489-89800a54f412\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:50:43.975097 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:43.974811 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/12495802-0b96-4046-a489-89800a54f412-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"12495802-0b96-4046-a489-89800a54f412\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:50:43.975097 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:43.974842 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/12495802-0b96-4046-a489-89800a54f412-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"12495802-0b96-4046-a489-89800a54f412\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:50:43.975097 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:43.974869 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/12495802-0b96-4046-a489-89800a54f412-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"12495802-0b96-4046-a489-89800a54f412\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:50:43.975097 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:43.974900 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/12495802-0b96-4046-a489-89800a54f412-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"12495802-0b96-4046-a489-89800a54f412\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:50:43.975097 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:43.974983 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/12495802-0b96-4046-a489-89800a54f412-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"12495802-0b96-4046-a489-89800a54f412\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:50:43.975097 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:43.975019 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/12495802-0b96-4046-a489-89800a54f412-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"12495802-0b96-4046-a489-89800a54f412\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:50:43.975097 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:43.975052 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/12495802-0b96-4046-a489-89800a54f412-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"12495802-0b96-4046-a489-89800a54f412\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:50:43.975097 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:43.975090 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/12495802-0b96-4046-a489-89800a54f412-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"12495802-0b96-4046-a489-89800a54f412\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:50:43.980588 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:43.975125 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/12495802-0b96-4046-a489-89800a54f412-config-out\") pod \"prometheus-k8s-0\" (UID: \"12495802-0b96-4046-a489-89800a54f412\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:50:43.980588 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:43.975164 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sn8wj\" (UniqueName: \"kubernetes.io/projected/12495802-0b96-4046-a489-89800a54f412-kube-api-access-sn8wj\") pod \"prometheus-k8s-0\" (UID: \"12495802-0b96-4046-a489-89800a54f412\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:50:43.980588 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:43.975196 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/12495802-0b96-4046-a489-89800a54f412-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"12495802-0b96-4046-a489-89800a54f412\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:50:43.980588 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:43.975886 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/12495802-0b96-4046-a489-89800a54f412-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"12495802-0b96-4046-a489-89800a54f412\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:50:43.980588 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:43.976943 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/12495802-0b96-4046-a489-89800a54f412-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"12495802-0b96-4046-a489-89800a54f412\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:50:43.980588 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:43.977606 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\"
(UniqueName: \"kubernetes.io/configmap/12495802-0b96-4046-a489-89800a54f412-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"12495802-0b96-4046-a489-89800a54f412\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:50:43.980588 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:43.977921 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/12495802-0b96-4046-a489-89800a54f412-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"12495802-0b96-4046-a489-89800a54f412\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:50:43.980588 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:43.979095 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/12495802-0b96-4046-a489-89800a54f412-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"12495802-0b96-4046-a489-89800a54f412\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:50:43.980588 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:43.980411 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/12495802-0b96-4046-a489-89800a54f412-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"12495802-0b96-4046-a489-89800a54f412\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:50:43.981145 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:43.980603 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/12495802-0b96-4046-a489-89800a54f412-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"12495802-0b96-4046-a489-89800a54f412\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:50:43.981145 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:43.980839 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/12495802-0b96-4046-a489-89800a54f412-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"12495802-0b96-4046-a489-89800a54f412\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:50:43.981145 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:43.981000 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/12495802-0b96-4046-a489-89800a54f412-web-config\") pod \"prometheus-k8s-0\" (UID: \"12495802-0b96-4046-a489-89800a54f412\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:50:43.981374 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:43.981315 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/12495802-0b96-4046-a489-89800a54f412-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"12495802-0b96-4046-a489-89800a54f412\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:50:43.981537 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:43.981428 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/12495802-0b96-4046-a489-89800a54f412-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"12495802-0b96-4046-a489-89800a54f412\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:50:43.981537 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:43.981465 2568 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/12495802-0b96-4046-a489-89800a54f412-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"12495802-0b96-4046-a489-89800a54f412\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:50:43.982377 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:43.982322 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/12495802-0b96-4046-a489-89800a54f412-config\") pod \"prometheus-k8s-0\" (UID: \"12495802-0b96-4046-a489-89800a54f412\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:50:43.982377 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:43.982354 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/12495802-0b96-4046-a489-89800a54f412-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"12495802-0b96-4046-a489-89800a54f412\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:50:43.983119 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:43.983098 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/12495802-0b96-4046-a489-89800a54f412-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"12495802-0b96-4046-a489-89800a54f412\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:50:43.984002 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:43.983957 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/12495802-0b96-4046-a489-89800a54f412-config-out\") pod \"prometheus-k8s-0\" (UID: \"12495802-0b96-4046-a489-89800a54f412\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:50:43.984099 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:43.984014 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/12495802-0b96-4046-a489-89800a54f412-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"12495802-0b96-4046-a489-89800a54f412\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:50:43.989615 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:43.989590 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sn8wj\" (UniqueName: \"kubernetes.io/projected/12495802-0b96-4046-a489-89800a54f412-kube-api-access-sn8wj\") pod \"prometheus-k8s-0\" (UID: \"12495802-0b96-4046-a489-89800a54f412\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:50:44.087851 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:44.087781 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:50:44.226132 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:44.226092 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 16:50:44.503372 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:50:44.503342 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod12495802_0b96_4046_a489_89800a54f412.slice/crio-2f50bf219298fceb58db46bd4e15d35973e151110fb76bd296d4287cafc0d8bb WatchSource:0}: Error finding container 2f50bf219298fceb58db46bd4e15d35973e151110fb76bd296d4287cafc0d8bb: Status 404 returned error can't find the container with id 2f50bf219298fceb58db46bd4e15d35973e151110fb76bd296d4287cafc0d8bb Apr 16 16:50:45.429414 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:45.429386 2568 generic.go:358] "Generic (PLEG): container finished" podID="12495802-0b96-4046-a489-89800a54f412" containerID="ff4b94ff8fc063f55b1f18dbefa7bf9279b10a3fa186cf3d66a0e0820a78a490" exitCode=0 Apr 16 16:50:45.429559 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:45.429471 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"12495802-0b96-4046-a489-89800a54f412","Type":"ContainerDied","Data":"ff4b94ff8fc063f55b1f18dbefa7bf9279b10a3fa186cf3d66a0e0820a78a490"} Apr 16 16:50:45.429559 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:45.429504 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"12495802-0b96-4046-a489-89800a54f412","Type":"ContainerStarted","Data":"2f50bf219298fceb58db46bd4e15d35973e151110fb76bd296d4287cafc0d8bb"} Apr 16 16:50:45.431482 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:45.431122 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-nxqlt" event={"ID":"a8fa4d4b-1853-466d-a89a-f197d586e400","Type":"ContainerStarted","Data":"ff595830f6de542277762533a1575ca2ecf33dacd8e846d7bc3ce9cb330e8bfc"} Apr 16 16:50:45.434454 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:45.431793 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-nxqlt" Apr 16 16:50:45.438603 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:45.438583 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-nxqlt" Apr 16 16:50:45.471682 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:45.471636 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-nxqlt" podStartSLOduration=2.275495516 podStartE2EDuration="3.471621615s" podCreationTimestamp="2026-04-16 16:50:42 +0000 UTC" firstStartedPulling="2026-04-16 16:50:43.35295945 +0000 UTC m=+180.023136211" lastFinishedPulling="2026-04-16 16:50:44.549085534 +0000 UTC m=+181.219262310" observedRunningTime="2026-04-16 16:50:45.471104078 +0000 UTC m=+182.141280860" watchObservedRunningTime="2026-04-16 16:50:45.471621615 +0000 UTC m=+182.141798393" Apr 16 16:50:45.826986 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:45.826942 2568 patch_prober.go:28] interesting pod/image-registry-7655577c6d-7np48 container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service 
unavailable","detail":"health check failed: please see /debug/health"}]} Apr 16 16:50:45.827409 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:45.827011 2568 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-7655577c6d-7np48" podUID="9d1ef785-865a-4cf2-b57e-d3b55eea5a1c" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 16:50:47.377189 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:47.377159 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66f6866d4-g7mhz" Apr 16 16:50:48.378909 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:48.378881 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-7655577c6d-7np48" Apr 16 16:50:48.442433 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:48.442401 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"12495802-0b96-4046-a489-89800a54f412","Type":"ContainerStarted","Data":"73afbdc518f846e4621a6fbae719ef3b9f416d3198e2b0b2d09d913c82bb091a"} Apr 16 16:50:48.442601 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:48.442443 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"12495802-0b96-4046-a489-89800a54f412","Type":"ContainerStarted","Data":"48eda8cbb03cd3f29f8ba3b765e6b5351d02a3a6401cb258e378d9080cc708ac"} Apr 16 16:50:50.453658 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:50.453619 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"12495802-0b96-4046-a489-89800a54f412","Type":"ContainerStarted","Data":"4e32e7b5262092bf4cd27be937db66d7ae3dc3a909c35a2e0600817a2800d817"} Apr 16 16:50:50.453658 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:50.453662 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"12495802-0b96-4046-a489-89800a54f412","Type":"ContainerStarted","Data":"6c2ecb432f00930a3145d4844c8d17ce901187265daa963520639b4271d344d4"} Apr 16 16:50:50.454075 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:50.453679 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"12495802-0b96-4046-a489-89800a54f412","Type":"ContainerStarted","Data":"9fbaba2c174834cf237f54f94137d024b156c3e340b2487ecaedc14adf7f310d"} Apr 16 16:50:50.454075 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:50.453691 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"12495802-0b96-4046-a489-89800a54f412","Type":"ContainerStarted","Data":"afd78565aec3cb88a5c7c2f2e97d374a854f6129590b35ca93020b5a727d046c"} Apr 16 16:50:50.486667 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:50.486614 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.559557736 podStartE2EDuration="7.486598569s" podCreationTimestamp="2026-04-16 16:50:43 +0000 UTC" firstStartedPulling="2026-04-16 16:50:44.50519566 +0000 UTC m=+181.175372431" lastFinishedPulling="2026-04-16 16:50:49.432236493 +0000 UTC m=+186.102413264" observedRunningTime="2026-04-16 16:50:50.483271246 +0000 UTC m=+187.153448025" watchObservedRunningTime="2026-04-16 16:50:50.486598569 +0000 UTC m=+187.156775347" Apr 16 16:50:52.390484 ip-10-0-131-63 kubenswrapper[2568]: I0416 
16:50:52.390419 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-66f6866d4-g7mhz" podUID="a06f9194-8eb8-465c-a248-cdeff2ea3ec9" containerName="registry" containerID="cri-o://e51a7557399529693ae4a60592003c9810a423e56a4369f21a735e55a6898bc5" gracePeriod=30 Apr 16 16:50:52.619217 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:52.619193 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66f6866d4-g7mhz" Apr 16 16:50:52.753805 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:52.753775 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/a06f9194-8eb8-465c-a248-cdeff2ea3ec9-image-registry-private-configuration\") pod \"a06f9194-8eb8-465c-a248-cdeff2ea3ec9\" (UID: \"a06f9194-8eb8-465c-a248-cdeff2ea3ec9\") " Apr 16 16:50:52.753967 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:52.753816 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a06f9194-8eb8-465c-a248-cdeff2ea3ec9-registry-certificates\") pod \"a06f9194-8eb8-465c-a248-cdeff2ea3ec9\" (UID: \"a06f9194-8eb8-465c-a248-cdeff2ea3ec9\") " Apr 16 16:50:52.753967 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:52.753873 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a06f9194-8eb8-465c-a248-cdeff2ea3ec9-installation-pull-secrets\") pod \"a06f9194-8eb8-465c-a248-cdeff2ea3ec9\" (UID: \"a06f9194-8eb8-465c-a248-cdeff2ea3ec9\") " Apr 16 16:50:52.754042 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:52.754024 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a06f9194-8eb8-465c-a248-cdeff2ea3ec9-trusted-ca\") pod \"a06f9194-8eb8-465c-a248-cdeff2ea3ec9\" (UID: \"a06f9194-8eb8-465c-a248-cdeff2ea3ec9\") " Apr 16 16:50:52.754082 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:52.754062 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n6t6q\" (UniqueName: \"kubernetes.io/projected/a06f9194-8eb8-465c-a248-cdeff2ea3ec9-kube-api-access-n6t6q\") pod \"a06f9194-8eb8-465c-a248-cdeff2ea3ec9\" (UID: \"a06f9194-8eb8-465c-a248-cdeff2ea3ec9\") " Apr 16 16:50:52.754126 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:52.754109 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a06f9194-8eb8-465c-a248-cdeff2ea3ec9-bound-sa-token\") pod \"a06f9194-8eb8-465c-a248-cdeff2ea3ec9\" (UID: \"a06f9194-8eb8-465c-a248-cdeff2ea3ec9\") " Apr 16 16:50:52.754168 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:52.754142 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a06f9194-8eb8-465c-a248-cdeff2ea3ec9-registry-tls\") pod \"a06f9194-8eb8-465c-a248-cdeff2ea3ec9\" (UID: \"a06f9194-8eb8-465c-a248-cdeff2ea3ec9\") " Apr 16 16:50:52.754295 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:52.754271 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a06f9194-8eb8-465c-a248-cdeff2ea3ec9-ca-trust-extracted\") pod 
\"a06f9194-8eb8-465c-a248-cdeff2ea3ec9\" (UID: \"a06f9194-8eb8-465c-a248-cdeff2ea3ec9\") " Apr 16 16:50:52.754377 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:52.754287 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a06f9194-8eb8-465c-a248-cdeff2ea3ec9-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "a06f9194-8eb8-465c-a248-cdeff2ea3ec9" (UID: "a06f9194-8eb8-465c-a248-cdeff2ea3ec9"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 16:50:52.754465 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:52.754441 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a06f9194-8eb8-465c-a248-cdeff2ea3ec9-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a06f9194-8eb8-465c-a248-cdeff2ea3ec9" (UID: "a06f9194-8eb8-465c-a248-cdeff2ea3ec9"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 16:50:52.754773 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:52.754614 2568 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a06f9194-8eb8-465c-a248-cdeff2ea3ec9-registry-certificates\") on node \"ip-10-0-131-63.ec2.internal\" DevicePath \"\"" Apr 16 16:50:52.754773 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:52.754745 2568 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a06f9194-8eb8-465c-a248-cdeff2ea3ec9-trusted-ca\") on node \"ip-10-0-131-63.ec2.internal\" DevicePath \"\"" Apr 16 16:50:52.756436 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:52.756407 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a06f9194-8eb8-465c-a248-cdeff2ea3ec9-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "a06f9194-8eb8-465c-a248-cdeff2ea3ec9" (UID: "a06f9194-8eb8-465c-a248-cdeff2ea3ec9"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:50:52.756621 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:52.756590 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a06f9194-8eb8-465c-a248-cdeff2ea3ec9-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "a06f9194-8eb8-465c-a248-cdeff2ea3ec9" (UID: "a06f9194-8eb8-465c-a248-cdeff2ea3ec9"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 16:50:52.756742 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:52.756687 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a06f9194-8eb8-465c-a248-cdeff2ea3ec9-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a06f9194-8eb8-465c-a248-cdeff2ea3ec9" (UID: "a06f9194-8eb8-465c-a248-cdeff2ea3ec9"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 16:50:52.756742 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:52.756694 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a06f9194-8eb8-465c-a248-cdeff2ea3ec9-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "a06f9194-8eb8-465c-a248-cdeff2ea3ec9" (UID: "a06f9194-8eb8-465c-a248-cdeff2ea3ec9"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:50:52.756811 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:52.756751 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a06f9194-8eb8-465c-a248-cdeff2ea3ec9-kube-api-access-n6t6q" (OuterVolumeSpecName: "kube-api-access-n6t6q") pod "a06f9194-8eb8-465c-a248-cdeff2ea3ec9" (UID: "a06f9194-8eb8-465c-a248-cdeff2ea3ec9"). InnerVolumeSpecName "kube-api-access-n6t6q". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 16:50:52.762578 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:52.762556 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a06f9194-8eb8-465c-a248-cdeff2ea3ec9-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "a06f9194-8eb8-465c-a248-cdeff2ea3ec9" (UID: "a06f9194-8eb8-465c-a248-cdeff2ea3ec9"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:50:52.855586 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:52.855549 2568 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a06f9194-8eb8-465c-a248-cdeff2ea3ec9-installation-pull-secrets\") on node \"ip-10-0-131-63.ec2.internal\" DevicePath \"\"" Apr 16 16:50:52.855586 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:52.855580 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-n6t6q\" (UniqueName: \"kubernetes.io/projected/a06f9194-8eb8-465c-a248-cdeff2ea3ec9-kube-api-access-n6t6q\") on node \"ip-10-0-131-63.ec2.internal\" DevicePath \"\"" Apr 16 16:50:52.855586 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:52.855592 2568 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a06f9194-8eb8-465c-a248-cdeff2ea3ec9-bound-sa-token\") on node \"ip-10-0-131-63.ec2.internal\" DevicePath \"\"" Apr 16 16:50:52.855800 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:52.855603 2568 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a06f9194-8eb8-465c-a248-cdeff2ea3ec9-registry-tls\") on node \"ip-10-0-131-63.ec2.internal\" DevicePath \"\"" Apr 16 16:50:52.855800 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:52.855612 2568 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a06f9194-8eb8-465c-a248-cdeff2ea3ec9-ca-trust-extracted\") on node \"ip-10-0-131-63.ec2.internal\" DevicePath \"\"" Apr 16 16:50:52.855800 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:52.855621 2568 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/a06f9194-8eb8-465c-a248-cdeff2ea3ec9-image-registry-private-configuration\") on node \"ip-10-0-131-63.ec2.internal\" DevicePath \"\"" Apr 16 16:50:53.466910 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:53.466877 2568 generic.go:358] "Generic (PLEG): container finished" podID="a06f9194-8eb8-465c-a248-cdeff2ea3ec9" containerID="e51a7557399529693ae4a60592003c9810a423e56a4369f21a735e55a6898bc5" exitCode=0 Apr 16 16:50:53.467336 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:53.466947 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66f6866d4-g7mhz" 
event={"ID":"a06f9194-8eb8-465c-a248-cdeff2ea3ec9","Type":"ContainerDied","Data":"e51a7557399529693ae4a60592003c9810a423e56a4369f21a735e55a6898bc5"} Apr 16 16:50:53.467336 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:53.466979 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66f6866d4-g7mhz" event={"ID":"a06f9194-8eb8-465c-a248-cdeff2ea3ec9","Type":"ContainerDied","Data":"7650009c6017bab8b84ea4b76da4ea91df610d8ffb6ba3bfe7c32df5c15c2f5c"} Apr 16 16:50:53.467336 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:53.467000 2568 scope.go:117] "RemoveContainer" containerID="e51a7557399529693ae4a60592003c9810a423e56a4369f21a735e55a6898bc5" Apr 16 16:50:53.467336 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:53.466950 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66f6866d4-g7mhz" Apr 16 16:50:53.475717 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:53.475576 2568 scope.go:117] "RemoveContainer" containerID="e51a7557399529693ae4a60592003c9810a423e56a4369f21a735e55a6898bc5" Apr 16 16:50:53.475897 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:50:53.475877 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e51a7557399529693ae4a60592003c9810a423e56a4369f21a735e55a6898bc5\": container with ID starting with e51a7557399529693ae4a60592003c9810a423e56a4369f21a735e55a6898bc5 not found: ID does not exist" containerID="e51a7557399529693ae4a60592003c9810a423e56a4369f21a735e55a6898bc5" Apr 16 16:50:53.475963 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:53.475909 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e51a7557399529693ae4a60592003c9810a423e56a4369f21a735e55a6898bc5"} err="failed to get container status \"e51a7557399529693ae4a60592003c9810a423e56a4369f21a735e55a6898bc5\": rpc error: code = NotFound desc = could not find container \"e51a7557399529693ae4a60592003c9810a423e56a4369f21a735e55a6898bc5\": container with ID starting with e51a7557399529693ae4a60592003c9810a423e56a4369f21a735e55a6898bc5 not found: ID does not exist" Apr 16 16:50:53.489473 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:53.489449 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-66f6866d4-g7mhz"] Apr 16 16:50:53.493130 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:53.493105 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-66f6866d4-g7mhz"] Apr 16 16:50:53.933184 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:53.933153 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a06f9194-8eb8-465c-a248-cdeff2ea3ec9" path="/var/lib/kubelet/pods/a06f9194-8eb8-465c-a248-cdeff2ea3ec9/volumes" Apr 16 16:50:54.088151 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:50:54.088103 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:51:07.348326 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:51:07.348289 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-4nrrn_af4910d1-f39b-44e2-805e-bcc17c0e30d0/serve-healthcheck-canary/0.log" Apr 16 16:51:07.508376 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:51:07.508343 2568 generic.go:358] "Generic (PLEG): container finished" podID="a7ed2b8a-4851-4792-868d-23a18751df58" 
containerID="daa5ab1cced67ab811b46ffd8aab939bbe48a31277d4f8f97837d287f465e941" exitCode=0 Apr 16 16:51:07.508538 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:51:07.508410 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-p8zc8" event={"ID":"a7ed2b8a-4851-4792-868d-23a18751df58","Type":"ContainerDied","Data":"daa5ab1cced67ab811b46ffd8aab939bbe48a31277d4f8f97837d287f465e941"} Apr 16 16:51:07.508737 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:51:07.508724 2568 scope.go:117] "RemoveContainer" containerID="daa5ab1cced67ab811b46ffd8aab939bbe48a31277d4f8f97837d287f465e941" Apr 16 16:51:07.922456 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:51:07.922427 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-667775844f-zn9sv_5d7eab9c-0611-4981-95ed-6811b8ca42d6/cluster-samples-operator/0.log" Apr 16 16:51:08.121186 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:51:08.121158 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-667775844f-zn9sv_5d7eab9c-0611-4981-95ed-6811b8ca42d6/cluster-samples-operator-watch/0.log" Apr 16 16:51:08.514712 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:51:08.514673 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-p8zc8" event={"ID":"a7ed2b8a-4851-4792-868d-23a18751df58","Type":"ContainerStarted","Data":"33b152a27ae9d4108ba8393cdaa0a2c4134aa842203b58e8560d9014bebcbfac"} Apr 16 16:51:44.088273 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:51:44.088171 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:51:44.107023 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:51:44.106992 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:51:44.623460 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:51:44.623433 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:51:55.755482 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:51:55.755426 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/65f280f9-caf6-429e-ac03-31bd647a05b6-metrics-certs\") pod \"network-metrics-daemon-5rwjz\" (UID: \"65f280f9-caf6-429e-ac03-31bd647a05b6\") " pod="openshift-multus/network-metrics-daemon-5rwjz" Apr 16 16:51:55.757741 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:51:55.757711 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/65f280f9-caf6-429e-ac03-31bd647a05b6-metrics-certs\") pod \"network-metrics-daemon-5rwjz\" (UID: \"65f280f9-caf6-429e-ac03-31bd647a05b6\") " pod="openshift-multus/network-metrics-daemon-5rwjz" Apr 16 16:51:55.834661 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:51:55.834630 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-fz45n\"" Apr 16 16:51:55.842898 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:51:55.842867 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-5rwjz" Apr 16 16:51:55.958970 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:51:55.958945 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-5rwjz"] Apr 16 16:51:55.961233 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:51:55.961206 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod65f280f9_caf6_429e_ac03_31bd647a05b6.slice/crio-83aa10ee789a02564a9bd6d248bbdf67dbca63aabcff8b4dbf8e8fed327b2580 WatchSource:0}: Error finding container 83aa10ee789a02564a9bd6d248bbdf67dbca63aabcff8b4dbf8e8fed327b2580: Status 404 returned error can't find the container with id 83aa10ee789a02564a9bd6d248bbdf67dbca63aabcff8b4dbf8e8fed327b2580 Apr 16 16:51:56.641909 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:51:56.641867 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-5rwjz" event={"ID":"65f280f9-caf6-429e-ac03-31bd647a05b6","Type":"ContainerStarted","Data":"83aa10ee789a02564a9bd6d248bbdf67dbca63aabcff8b4dbf8e8fed327b2580"} Apr 16 16:51:57.650447 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:51:57.650407 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-5rwjz" event={"ID":"65f280f9-caf6-429e-ac03-31bd647a05b6","Type":"ContainerStarted","Data":"5bbdd18f5d6ed42ec708b2dd8d5f7c78581458b452a48722897daa066624f03a"} Apr 16 16:51:57.650447 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:51:57.650450 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-5rwjz" event={"ID":"65f280f9-caf6-429e-ac03-31bd647a05b6","Type":"ContainerStarted","Data":"4cb0750bf6ba4f02fbf5074d7255173c039be8697dae75ffcec72aee4ebe2214"} Apr 16 16:51:57.666627 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:51:57.666578 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-5rwjz" podStartSLOduration=252.689455184 podStartE2EDuration="4m13.66656404s" podCreationTimestamp="2026-04-16 16:47:44 +0000 UTC" firstStartedPulling="2026-04-16 16:51:55.962963164 +0000 UTC m=+252.633139926" lastFinishedPulling="2026-04-16 16:51:56.940072015 +0000 UTC m=+253.610248782" observedRunningTime="2026-04-16 16:51:57.664492078 +0000 UTC m=+254.334668857" watchObservedRunningTime="2026-04-16 16:51:57.66656404 +0000 UTC m=+254.336740818" Apr 16 16:52:02.142381 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.142348 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 16:52:02.142824 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.142788 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="12495802-0b96-4046-a489-89800a54f412" containerName="prometheus" containerID="cri-o://48eda8cbb03cd3f29f8ba3b765e6b5351d02a3a6401cb258e378d9080cc708ac" gracePeriod=600 Apr 16 16:52:02.142984 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.142819 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="12495802-0b96-4046-a489-89800a54f412" containerName="kube-rbac-proxy" containerID="cri-o://6c2ecb432f00930a3145d4844c8d17ce901187265daa963520639b4271d344d4" gracePeriod=600 Apr 16 16:52:02.142984 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.142834 2568 kuberuntime_container.go:864] 
"Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="12495802-0b96-4046-a489-89800a54f412" containerName="thanos-sidecar" containerID="cri-o://afd78565aec3cb88a5c7c2f2e97d374a854f6129590b35ca93020b5a727d046c" gracePeriod=600 Apr 16 16:52:02.142984 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.142857 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="12495802-0b96-4046-a489-89800a54f412" containerName="kube-rbac-proxy-web" containerID="cri-o://9fbaba2c174834cf237f54f94137d024b156c3e340b2487ecaedc14adf7f310d" gracePeriod=600 Apr 16 16:52:02.142984 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.142888 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="12495802-0b96-4046-a489-89800a54f412" containerName="kube-rbac-proxy-thanos" containerID="cri-o://4e32e7b5262092bf4cd27be937db66d7ae3dc3a909c35a2e0600817a2800d817" gracePeriod=600 Apr 16 16:52:02.142984 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.142932 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="12495802-0b96-4046-a489-89800a54f412" containerName="config-reloader" containerID="cri-o://73afbdc518f846e4621a6fbae719ef3b9f416d3198e2b0b2d09d913c82bb091a" gracePeriod=600 Apr 16 16:52:02.383748 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.383723 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:52:02.507560 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.507509 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/12495802-0b96-4046-a489-89800a54f412-secret-prometheus-k8s-tls\") pod \"12495802-0b96-4046-a489-89800a54f412\" (UID: \"12495802-0b96-4046-a489-89800a54f412\") " Apr 16 16:52:02.507708 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.507575 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/12495802-0b96-4046-a489-89800a54f412-configmap-kubelet-serving-ca-bundle\") pod \"12495802-0b96-4046-a489-89800a54f412\" (UID: \"12495802-0b96-4046-a489-89800a54f412\") " Apr 16 16:52:02.507708 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.507594 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/12495802-0b96-4046-a489-89800a54f412-tls-assets\") pod \"12495802-0b96-4046-a489-89800a54f412\" (UID: \"12495802-0b96-4046-a489-89800a54f412\") " Apr 16 16:52:02.507708 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.507615 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/12495802-0b96-4046-a489-89800a54f412-thanos-prometheus-http-client-file\") pod \"12495802-0b96-4046-a489-89800a54f412\" (UID: \"12495802-0b96-4046-a489-89800a54f412\") " Apr 16 16:52:02.507708 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.507639 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/12495802-0b96-4046-a489-89800a54f412-secret-prometheus-k8s-thanos-sidecar-tls\") pod 
\"12495802-0b96-4046-a489-89800a54f412\" (UID: \"12495802-0b96-4046-a489-89800a54f412\") " Apr 16 16:52:02.507708 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.507666 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/12495802-0b96-4046-a489-89800a54f412-prometheus-trusted-ca-bundle\") pod \"12495802-0b96-4046-a489-89800a54f412\" (UID: \"12495802-0b96-4046-a489-89800a54f412\") " Apr 16 16:52:02.507708 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.507692 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/12495802-0b96-4046-a489-89800a54f412-config\") pod \"12495802-0b96-4046-a489-89800a54f412\" (UID: \"12495802-0b96-4046-a489-89800a54f412\") " Apr 16 16:52:02.508006 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.507729 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/12495802-0b96-4046-a489-89800a54f412-configmap-serving-certs-ca-bundle\") pod \"12495802-0b96-4046-a489-89800a54f412\" (UID: \"12495802-0b96-4046-a489-89800a54f412\") " Apr 16 16:52:02.508006 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.507773 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/12495802-0b96-4046-a489-89800a54f412-config-out\") pod \"12495802-0b96-4046-a489-89800a54f412\" (UID: \"12495802-0b96-4046-a489-89800a54f412\") " Apr 16 16:52:02.508006 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.507802 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/12495802-0b96-4046-a489-89800a54f412-secret-metrics-client-certs\") pod \"12495802-0b96-4046-a489-89800a54f412\" (UID: \"12495802-0b96-4046-a489-89800a54f412\") " Apr 16 16:52:02.508006 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.507831 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/12495802-0b96-4046-a489-89800a54f412-secret-kube-rbac-proxy\") pod \"12495802-0b96-4046-a489-89800a54f412\" (UID: \"12495802-0b96-4046-a489-89800a54f412\") " Apr 16 16:52:02.508006 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.507865 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/12495802-0b96-4046-a489-89800a54f412-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"12495802-0b96-4046-a489-89800a54f412\" (UID: \"12495802-0b96-4046-a489-89800a54f412\") " Apr 16 16:52:02.508006 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.507888 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/12495802-0b96-4046-a489-89800a54f412-web-config\") pod \"12495802-0b96-4046-a489-89800a54f412\" (UID: \"12495802-0b96-4046-a489-89800a54f412\") " Apr 16 16:52:02.508006 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.507917 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/12495802-0b96-4046-a489-89800a54f412-configmap-metrics-client-ca\") pod \"12495802-0b96-4046-a489-89800a54f412\" (UID: 
\"12495802-0b96-4046-a489-89800a54f412\") " Apr 16 16:52:02.508006 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.507962 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/12495802-0b96-4046-a489-89800a54f412-prometheus-k8s-db\") pod \"12495802-0b96-4046-a489-89800a54f412\" (UID: \"12495802-0b96-4046-a489-89800a54f412\") " Apr 16 16:52:02.508006 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.507986 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/12495802-0b96-4046-a489-89800a54f412-prometheus-k8s-rulefiles-0\") pod \"12495802-0b96-4046-a489-89800a54f412\" (UID: \"12495802-0b96-4046-a489-89800a54f412\") " Apr 16 16:52:02.508447 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.508028 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/12495802-0b96-4046-a489-89800a54f412-secret-grpc-tls\") pod \"12495802-0b96-4046-a489-89800a54f412\" (UID: \"12495802-0b96-4046-a489-89800a54f412\") " Apr 16 16:52:02.508447 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.508055 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sn8wj\" (UniqueName: \"kubernetes.io/projected/12495802-0b96-4046-a489-89800a54f412-kube-api-access-sn8wj\") pod \"12495802-0b96-4046-a489-89800a54f412\" (UID: \"12495802-0b96-4046-a489-89800a54f412\") " Apr 16 16:52:02.508447 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.508070 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12495802-0b96-4046-a489-89800a54f412-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "12495802-0b96-4046-a489-89800a54f412" (UID: "12495802-0b96-4046-a489-89800a54f412"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 16:52:02.508447 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.508278 2568 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/12495802-0b96-4046-a489-89800a54f412-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-131-63.ec2.internal\" DevicePath \"\"" Apr 16 16:52:02.508447 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.508336 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12495802-0b96-4046-a489-89800a54f412-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "12495802-0b96-4046-a489-89800a54f412" (UID: "12495802-0b96-4046-a489-89800a54f412"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 16:52:02.509983 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.509249 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12495802-0b96-4046-a489-89800a54f412-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "12495802-0b96-4046-a489-89800a54f412" (UID: "12495802-0b96-4046-a489-89800a54f412"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 16:52:02.509983 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.509725 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12495802-0b96-4046-a489-89800a54f412-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "12495802-0b96-4046-a489-89800a54f412" (UID: "12495802-0b96-4046-a489-89800a54f412"). InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 16:52:02.509983 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.509938 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/12495802-0b96-4046-a489-89800a54f412-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "12495802-0b96-4046-a489-89800a54f412" (UID: "12495802-0b96-4046-a489-89800a54f412"). InnerVolumeSpecName "prometheus-k8s-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:52:02.510588 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.510557 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12495802-0b96-4046-a489-89800a54f412-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "12495802-0b96-4046-a489-89800a54f412" (UID: "12495802-0b96-4046-a489-89800a54f412"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 16:52:02.511121 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.510806 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12495802-0b96-4046-a489-89800a54f412-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "12495802-0b96-4046-a489-89800a54f412" (UID: "12495802-0b96-4046-a489-89800a54f412"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:52:02.511121 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.510882 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12495802-0b96-4046-a489-89800a54f412-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "12495802-0b96-4046-a489-89800a54f412" (UID: "12495802-0b96-4046-a489-89800a54f412"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:52:02.511121 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.510965 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12495802-0b96-4046-a489-89800a54f412-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "12495802-0b96-4046-a489-89800a54f412" (UID: "12495802-0b96-4046-a489-89800a54f412"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:52:02.511121 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.511031 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12495802-0b96-4046-a489-89800a54f412-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "12495802-0b96-4046-a489-89800a54f412" (UID: "12495802-0b96-4046-a489-89800a54f412"). InnerVolumeSpecName "tls-assets". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 16:52:02.511121 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.511089 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12495802-0b96-4046-a489-89800a54f412-kube-api-access-sn8wj" (OuterVolumeSpecName: "kube-api-access-sn8wj") pod "12495802-0b96-4046-a489-89800a54f412" (UID: "12495802-0b96-4046-a489-89800a54f412"). InnerVolumeSpecName "kube-api-access-sn8wj". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 16:52:02.512507 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.512482 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12495802-0b96-4046-a489-89800a54f412-config" (OuterVolumeSpecName: "config") pod "12495802-0b96-4046-a489-89800a54f412" (UID: "12495802-0b96-4046-a489-89800a54f412"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:52:02.512633 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.512614 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12495802-0b96-4046-a489-89800a54f412-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "12495802-0b96-4046-a489-89800a54f412" (UID: "12495802-0b96-4046-a489-89800a54f412"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:52:02.512698 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.512620 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/12495802-0b96-4046-a489-89800a54f412-config-out" (OuterVolumeSpecName: "config-out") pod "12495802-0b96-4046-a489-89800a54f412" (UID: "12495802-0b96-4046-a489-89800a54f412"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:52:02.512698 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.512641 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12495802-0b96-4046-a489-89800a54f412-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "12495802-0b96-4046-a489-89800a54f412" (UID: "12495802-0b96-4046-a489-89800a54f412"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:52:02.512698 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.512677 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12495802-0b96-4046-a489-89800a54f412-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "12495802-0b96-4046-a489-89800a54f412" (UID: "12495802-0b96-4046-a489-89800a54f412"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:52:02.512813 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.512721 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12495802-0b96-4046-a489-89800a54f412-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "12495802-0b96-4046-a489-89800a54f412" (UID: "12495802-0b96-4046-a489-89800a54f412"). InnerVolumeSpecName "secret-metrics-client-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:52:02.521368 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.521346 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12495802-0b96-4046-a489-89800a54f412-web-config" (OuterVolumeSpecName: "web-config") pod "12495802-0b96-4046-a489-89800a54f412" (UID: "12495802-0b96-4046-a489-89800a54f412"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:52:02.608909 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.608857 2568 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/12495802-0b96-4046-a489-89800a54f412-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-131-63.ec2.internal\" DevicePath \"\"" Apr 16 16:52:02.608909 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.608905 2568 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/12495802-0b96-4046-a489-89800a54f412-web-config\") on node \"ip-10-0-131-63.ec2.internal\" DevicePath \"\"" Apr 16 16:52:02.608909 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.608920 2568 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/12495802-0b96-4046-a489-89800a54f412-configmap-metrics-client-ca\") on node \"ip-10-0-131-63.ec2.internal\" DevicePath \"\"" Apr 16 16:52:02.609169 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.608933 2568 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/12495802-0b96-4046-a489-89800a54f412-prometheus-k8s-db\") on node \"ip-10-0-131-63.ec2.internal\" DevicePath \"\"" Apr 16 16:52:02.609169 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.608945 2568 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/12495802-0b96-4046-a489-89800a54f412-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-131-63.ec2.internal\" DevicePath \"\"" Apr 16 16:52:02.609169 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.608959 2568 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/12495802-0b96-4046-a489-89800a54f412-secret-grpc-tls\") on node \"ip-10-0-131-63.ec2.internal\" DevicePath \"\"" Apr 16 16:52:02.609169 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.608972 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-sn8wj\" (UniqueName: \"kubernetes.io/projected/12495802-0b96-4046-a489-89800a54f412-kube-api-access-sn8wj\") on node \"ip-10-0-131-63.ec2.internal\" DevicePath \"\"" Apr 16 16:52:02.609169 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.608984 2568 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/12495802-0b96-4046-a489-89800a54f412-secret-prometheus-k8s-tls\") on node \"ip-10-0-131-63.ec2.internal\" DevicePath \"\"" Apr 16 16:52:02.609169 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.608996 2568 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/12495802-0b96-4046-a489-89800a54f412-tls-assets\") on node \"ip-10-0-131-63.ec2.internal\" DevicePath \"\"" Apr 16 16:52:02.609169 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.609008 2568 reconciler_common.go:299] "Volume detached for volume 
\"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/12495802-0b96-4046-a489-89800a54f412-thanos-prometheus-http-client-file\") on node \"ip-10-0-131-63.ec2.internal\" DevicePath \"\"" Apr 16 16:52:02.609169 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.609021 2568 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/12495802-0b96-4046-a489-89800a54f412-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-131-63.ec2.internal\" DevicePath \"\"" Apr 16 16:52:02.609169 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.609034 2568 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/12495802-0b96-4046-a489-89800a54f412-prometheus-trusted-ca-bundle\") on node \"ip-10-0-131-63.ec2.internal\" DevicePath \"\"" Apr 16 16:52:02.609169 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.609046 2568 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/12495802-0b96-4046-a489-89800a54f412-config\") on node \"ip-10-0-131-63.ec2.internal\" DevicePath \"\"" Apr 16 16:52:02.609169 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.609060 2568 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/12495802-0b96-4046-a489-89800a54f412-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-131-63.ec2.internal\" DevicePath \"\"" Apr 16 16:52:02.609169 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.609071 2568 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/12495802-0b96-4046-a489-89800a54f412-config-out\") on node \"ip-10-0-131-63.ec2.internal\" DevicePath \"\"" Apr 16 16:52:02.609169 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.609085 2568 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/12495802-0b96-4046-a489-89800a54f412-secret-metrics-client-certs\") on node \"ip-10-0-131-63.ec2.internal\" DevicePath \"\"" Apr 16 16:52:02.609169 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.609099 2568 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/12495802-0b96-4046-a489-89800a54f412-secret-kube-rbac-proxy\") on node \"ip-10-0-131-63.ec2.internal\" DevicePath \"\"" Apr 16 16:52:02.669800 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.669764 2568 generic.go:358] "Generic (PLEG): container finished" podID="12495802-0b96-4046-a489-89800a54f412" containerID="4e32e7b5262092bf4cd27be937db66d7ae3dc3a909c35a2e0600817a2800d817" exitCode=0 Apr 16 16:52:02.669800 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.669790 2568 generic.go:358] "Generic (PLEG): container finished" podID="12495802-0b96-4046-a489-89800a54f412" containerID="6c2ecb432f00930a3145d4844c8d17ce901187265daa963520639b4271d344d4" exitCode=0 Apr 16 16:52:02.669800 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.669796 2568 generic.go:358] "Generic (PLEG): container finished" podID="12495802-0b96-4046-a489-89800a54f412" containerID="9fbaba2c174834cf237f54f94137d024b156c3e340b2487ecaedc14adf7f310d" exitCode=0 Apr 16 16:52:02.669800 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.669802 2568 generic.go:358] "Generic (PLEG): container finished" podID="12495802-0b96-4046-a489-89800a54f412" 
containerID="afd78565aec3cb88a5c7c2f2e97d374a854f6129590b35ca93020b5a727d046c" exitCode=0 Apr 16 16:52:02.669800 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.669807 2568 generic.go:358] "Generic (PLEG): container finished" podID="12495802-0b96-4046-a489-89800a54f412" containerID="73afbdc518f846e4621a6fbae719ef3b9f416d3198e2b0b2d09d913c82bb091a" exitCode=0 Apr 16 16:52:02.669800 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.669812 2568 generic.go:358] "Generic (PLEG): container finished" podID="12495802-0b96-4046-a489-89800a54f412" containerID="48eda8cbb03cd3f29f8ba3b765e6b5351d02a3a6401cb258e378d9080cc708ac" exitCode=0 Apr 16 16:52:02.670133 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.669841 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"12495802-0b96-4046-a489-89800a54f412","Type":"ContainerDied","Data":"4e32e7b5262092bf4cd27be937db66d7ae3dc3a909c35a2e0600817a2800d817"} Apr 16 16:52:02.670133 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.669880 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"12495802-0b96-4046-a489-89800a54f412","Type":"ContainerDied","Data":"6c2ecb432f00930a3145d4844c8d17ce901187265daa963520639b4271d344d4"} Apr 16 16:52:02.670133 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.669890 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"12495802-0b96-4046-a489-89800a54f412","Type":"ContainerDied","Data":"9fbaba2c174834cf237f54f94137d024b156c3e340b2487ecaedc14adf7f310d"} Apr 16 16:52:02.670133 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.669901 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"12495802-0b96-4046-a489-89800a54f412","Type":"ContainerDied","Data":"afd78565aec3cb88a5c7c2f2e97d374a854f6129590b35ca93020b5a727d046c"} Apr 16 16:52:02.670133 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.669912 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"12495802-0b96-4046-a489-89800a54f412","Type":"ContainerDied","Data":"73afbdc518f846e4621a6fbae719ef3b9f416d3198e2b0b2d09d913c82bb091a"} Apr 16 16:52:02.670133 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.669920 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"12495802-0b96-4046-a489-89800a54f412","Type":"ContainerDied","Data":"48eda8cbb03cd3f29f8ba3b765e6b5351d02a3a6401cb258e378d9080cc708ac"} Apr 16 16:52:02.670133 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.669930 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"12495802-0b96-4046-a489-89800a54f412","Type":"ContainerDied","Data":"2f50bf219298fceb58db46bd4e15d35973e151110fb76bd296d4287cafc0d8bb"} Apr 16 16:52:02.670133 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.669889 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:52:02.670133 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.669956 2568 scope.go:117] "RemoveContainer" containerID="4e32e7b5262092bf4cd27be937db66d7ae3dc3a909c35a2e0600817a2800d817" Apr 16 16:52:02.677416 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.677287 2568 scope.go:117] "RemoveContainer" containerID="6c2ecb432f00930a3145d4844c8d17ce901187265daa963520639b4271d344d4" Apr 16 16:52:02.683707 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.683689 2568 scope.go:117] "RemoveContainer" containerID="9fbaba2c174834cf237f54f94137d024b156c3e340b2487ecaedc14adf7f310d" Apr 16 16:52:02.689666 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.689650 2568 scope.go:117] "RemoveContainer" containerID="afd78565aec3cb88a5c7c2f2e97d374a854f6129590b35ca93020b5a727d046c" Apr 16 16:52:02.692723 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.692699 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 16:52:02.696177 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.696161 2568 scope.go:117] "RemoveContainer" containerID="73afbdc518f846e4621a6fbae719ef3b9f416d3198e2b0b2d09d913c82bb091a" Apr 16 16:52:02.701389 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.700371 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 16:52:02.707860 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.707834 2568 scope.go:117] "RemoveContainer" containerID="48eda8cbb03cd3f29f8ba3b765e6b5351d02a3a6401cb258e378d9080cc708ac" Apr 16 16:52:02.714497 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.714479 2568 scope.go:117] "RemoveContainer" containerID="ff4b94ff8fc063f55b1f18dbefa7bf9279b10a3fa186cf3d66a0e0820a78a490" Apr 16 16:52:02.720348 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.720333 2568 scope.go:117] "RemoveContainer" containerID="4e32e7b5262092bf4cd27be937db66d7ae3dc3a909c35a2e0600817a2800d817" Apr 16 16:52:02.720586 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:52:02.720565 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e32e7b5262092bf4cd27be937db66d7ae3dc3a909c35a2e0600817a2800d817\": container with ID starting with 4e32e7b5262092bf4cd27be937db66d7ae3dc3a909c35a2e0600817a2800d817 not found: ID does not exist" containerID="4e32e7b5262092bf4cd27be937db66d7ae3dc3a909c35a2e0600817a2800d817" Apr 16 16:52:02.720647 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.720594 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e32e7b5262092bf4cd27be937db66d7ae3dc3a909c35a2e0600817a2800d817"} err="failed to get container status \"4e32e7b5262092bf4cd27be937db66d7ae3dc3a909c35a2e0600817a2800d817\": rpc error: code = NotFound desc = could not find container \"4e32e7b5262092bf4cd27be937db66d7ae3dc3a909c35a2e0600817a2800d817\": container with ID starting with 4e32e7b5262092bf4cd27be937db66d7ae3dc3a909c35a2e0600817a2800d817 not found: ID does not exist" Apr 16 16:52:02.720647 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.720612 2568 scope.go:117] "RemoveContainer" containerID="6c2ecb432f00930a3145d4844c8d17ce901187265daa963520639b4271d344d4" Apr 16 16:52:02.720828 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:52:02.720809 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"6c2ecb432f00930a3145d4844c8d17ce901187265daa963520639b4271d344d4\": container with ID starting with 6c2ecb432f00930a3145d4844c8d17ce901187265daa963520639b4271d344d4 not found: ID does not exist" containerID="6c2ecb432f00930a3145d4844c8d17ce901187265daa963520639b4271d344d4" Apr 16 16:52:02.720870 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.720835 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c2ecb432f00930a3145d4844c8d17ce901187265daa963520639b4271d344d4"} err="failed to get container status \"6c2ecb432f00930a3145d4844c8d17ce901187265daa963520639b4271d344d4\": rpc error: code = NotFound desc = could not find container \"6c2ecb432f00930a3145d4844c8d17ce901187265daa963520639b4271d344d4\": container with ID starting with 6c2ecb432f00930a3145d4844c8d17ce901187265daa963520639b4271d344d4 not found: ID does not exist" Apr 16 16:52:02.720907 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.720872 2568 scope.go:117] "RemoveContainer" containerID="9fbaba2c174834cf237f54f94137d024b156c3e340b2487ecaedc14adf7f310d" Apr 16 16:52:02.721069 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:52:02.721055 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9fbaba2c174834cf237f54f94137d024b156c3e340b2487ecaedc14adf7f310d\": container with ID starting with 9fbaba2c174834cf237f54f94137d024b156c3e340b2487ecaedc14adf7f310d not found: ID does not exist" containerID="9fbaba2c174834cf237f54f94137d024b156c3e340b2487ecaedc14adf7f310d" Apr 16 16:52:02.721106 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.721073 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9fbaba2c174834cf237f54f94137d024b156c3e340b2487ecaedc14adf7f310d"} err="failed to get container status \"9fbaba2c174834cf237f54f94137d024b156c3e340b2487ecaedc14adf7f310d\": rpc error: code = NotFound desc = could not find container \"9fbaba2c174834cf237f54f94137d024b156c3e340b2487ecaedc14adf7f310d\": container with ID starting with 9fbaba2c174834cf237f54f94137d024b156c3e340b2487ecaedc14adf7f310d not found: ID does not exist" Apr 16 16:52:02.721106 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.721086 2568 scope.go:117] "RemoveContainer" containerID="afd78565aec3cb88a5c7c2f2e97d374a854f6129590b35ca93020b5a727d046c" Apr 16 16:52:02.721302 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:52:02.721286 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"afd78565aec3cb88a5c7c2f2e97d374a854f6129590b35ca93020b5a727d046c\": container with ID starting with afd78565aec3cb88a5c7c2f2e97d374a854f6129590b35ca93020b5a727d046c not found: ID does not exist" containerID="afd78565aec3cb88a5c7c2f2e97d374a854f6129590b35ca93020b5a727d046c" Apr 16 16:52:02.721352 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.721305 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"afd78565aec3cb88a5c7c2f2e97d374a854f6129590b35ca93020b5a727d046c"} err="failed to get container status \"afd78565aec3cb88a5c7c2f2e97d374a854f6129590b35ca93020b5a727d046c\": rpc error: code = NotFound desc = could not find container \"afd78565aec3cb88a5c7c2f2e97d374a854f6129590b35ca93020b5a727d046c\": container with ID starting with afd78565aec3cb88a5c7c2f2e97d374a854f6129590b35ca93020b5a727d046c not found: ID does not exist" Apr 16 16:52:02.721352 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.721316 2568 
scope.go:117] "RemoveContainer" containerID="73afbdc518f846e4621a6fbae719ef3b9f416d3198e2b0b2d09d913c82bb091a" Apr 16 16:52:02.721494 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:52:02.721479 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"73afbdc518f846e4621a6fbae719ef3b9f416d3198e2b0b2d09d913c82bb091a\": container with ID starting with 73afbdc518f846e4621a6fbae719ef3b9f416d3198e2b0b2d09d913c82bb091a not found: ID does not exist" containerID="73afbdc518f846e4621a6fbae719ef3b9f416d3198e2b0b2d09d913c82bb091a" Apr 16 16:52:02.721553 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.721496 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73afbdc518f846e4621a6fbae719ef3b9f416d3198e2b0b2d09d913c82bb091a"} err="failed to get container status \"73afbdc518f846e4621a6fbae719ef3b9f416d3198e2b0b2d09d913c82bb091a\": rpc error: code = NotFound desc = could not find container \"73afbdc518f846e4621a6fbae719ef3b9f416d3198e2b0b2d09d913c82bb091a\": container with ID starting with 73afbdc518f846e4621a6fbae719ef3b9f416d3198e2b0b2d09d913c82bb091a not found: ID does not exist" Apr 16 16:52:02.721553 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.721507 2568 scope.go:117] "RemoveContainer" containerID="48eda8cbb03cd3f29f8ba3b765e6b5351d02a3a6401cb258e378d9080cc708ac" Apr 16 16:52:02.721842 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:52:02.721816 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"48eda8cbb03cd3f29f8ba3b765e6b5351d02a3a6401cb258e378d9080cc708ac\": container with ID starting with 48eda8cbb03cd3f29f8ba3b765e6b5351d02a3a6401cb258e378d9080cc708ac not found: ID does not exist" containerID="48eda8cbb03cd3f29f8ba3b765e6b5351d02a3a6401cb258e378d9080cc708ac" Apr 16 16:52:02.721937 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.721849 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48eda8cbb03cd3f29f8ba3b765e6b5351d02a3a6401cb258e378d9080cc708ac"} err="failed to get container status \"48eda8cbb03cd3f29f8ba3b765e6b5351d02a3a6401cb258e378d9080cc708ac\": rpc error: code = NotFound desc = could not find container \"48eda8cbb03cd3f29f8ba3b765e6b5351d02a3a6401cb258e378d9080cc708ac\": container with ID starting with 48eda8cbb03cd3f29f8ba3b765e6b5351d02a3a6401cb258e378d9080cc708ac not found: ID does not exist" Apr 16 16:52:02.721937 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.721867 2568 scope.go:117] "RemoveContainer" containerID="ff4b94ff8fc063f55b1f18dbefa7bf9279b10a3fa186cf3d66a0e0820a78a490" Apr 16 16:52:02.722143 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:52:02.722121 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff4b94ff8fc063f55b1f18dbefa7bf9279b10a3fa186cf3d66a0e0820a78a490\": container with ID starting with ff4b94ff8fc063f55b1f18dbefa7bf9279b10a3fa186cf3d66a0e0820a78a490 not found: ID does not exist" containerID="ff4b94ff8fc063f55b1f18dbefa7bf9279b10a3fa186cf3d66a0e0820a78a490" Apr 16 16:52:02.722197 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.722148 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff4b94ff8fc063f55b1f18dbefa7bf9279b10a3fa186cf3d66a0e0820a78a490"} err="failed to get container status \"ff4b94ff8fc063f55b1f18dbefa7bf9279b10a3fa186cf3d66a0e0820a78a490\": rpc error: code = 
NotFound desc = could not find container \"ff4b94ff8fc063f55b1f18dbefa7bf9279b10a3fa186cf3d66a0e0820a78a490\": container with ID starting with ff4b94ff8fc063f55b1f18dbefa7bf9279b10a3fa186cf3d66a0e0820a78a490 not found: ID does not exist" Apr 16 16:52:02.722197 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.722164 2568 scope.go:117] "RemoveContainer" containerID="4e32e7b5262092bf4cd27be937db66d7ae3dc3a909c35a2e0600817a2800d817" Apr 16 16:52:02.722631 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.722607 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e32e7b5262092bf4cd27be937db66d7ae3dc3a909c35a2e0600817a2800d817"} err="failed to get container status \"4e32e7b5262092bf4cd27be937db66d7ae3dc3a909c35a2e0600817a2800d817\": rpc error: code = NotFound desc = could not find container \"4e32e7b5262092bf4cd27be937db66d7ae3dc3a909c35a2e0600817a2800d817\": container with ID starting with 4e32e7b5262092bf4cd27be937db66d7ae3dc3a909c35a2e0600817a2800d817 not found: ID does not exist" Apr 16 16:52:02.722710 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.722633 2568 scope.go:117] "RemoveContainer" containerID="6c2ecb432f00930a3145d4844c8d17ce901187265daa963520639b4271d344d4" Apr 16 16:52:02.722919 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.722882 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c2ecb432f00930a3145d4844c8d17ce901187265daa963520639b4271d344d4"} err="failed to get container status \"6c2ecb432f00930a3145d4844c8d17ce901187265daa963520639b4271d344d4\": rpc error: code = NotFound desc = could not find container \"6c2ecb432f00930a3145d4844c8d17ce901187265daa963520639b4271d344d4\": container with ID starting with 6c2ecb432f00930a3145d4844c8d17ce901187265daa963520639b4271d344d4 not found: ID does not exist" Apr 16 16:52:02.722919 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.722919 2568 scope.go:117] "RemoveContainer" containerID="9fbaba2c174834cf237f54f94137d024b156c3e340b2487ecaedc14adf7f310d" Apr 16 16:52:02.723195 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.723170 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9fbaba2c174834cf237f54f94137d024b156c3e340b2487ecaedc14adf7f310d"} err="failed to get container status \"9fbaba2c174834cf237f54f94137d024b156c3e340b2487ecaedc14adf7f310d\": rpc error: code = NotFound desc = could not find container \"9fbaba2c174834cf237f54f94137d024b156c3e340b2487ecaedc14adf7f310d\": container with ID starting with 9fbaba2c174834cf237f54f94137d024b156c3e340b2487ecaedc14adf7f310d not found: ID does not exist" Apr 16 16:52:02.723261 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.723197 2568 scope.go:117] "RemoveContainer" containerID="afd78565aec3cb88a5c7c2f2e97d374a854f6129590b35ca93020b5a727d046c" Apr 16 16:52:02.723454 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.723433 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"afd78565aec3cb88a5c7c2f2e97d374a854f6129590b35ca93020b5a727d046c"} err="failed to get container status \"afd78565aec3cb88a5c7c2f2e97d374a854f6129590b35ca93020b5a727d046c\": rpc error: code = NotFound desc = could not find container \"afd78565aec3cb88a5c7c2f2e97d374a854f6129590b35ca93020b5a727d046c\": container with ID starting with afd78565aec3cb88a5c7c2f2e97d374a854f6129590b35ca93020b5a727d046c not found: ID does not exist" Apr 16 16:52:02.723553 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.723456 
2568 scope.go:117] "RemoveContainer" containerID="73afbdc518f846e4621a6fbae719ef3b9f416d3198e2b0b2d09d913c82bb091a" Apr 16 16:52:02.723684 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.723665 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73afbdc518f846e4621a6fbae719ef3b9f416d3198e2b0b2d09d913c82bb091a"} err="failed to get container status \"73afbdc518f846e4621a6fbae719ef3b9f416d3198e2b0b2d09d913c82bb091a\": rpc error: code = NotFound desc = could not find container \"73afbdc518f846e4621a6fbae719ef3b9f416d3198e2b0b2d09d913c82bb091a\": container with ID starting with 73afbdc518f846e4621a6fbae719ef3b9f416d3198e2b0b2d09d913c82bb091a not found: ID does not exist" Apr 16 16:52:02.723736 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.723685 2568 scope.go:117] "RemoveContainer" containerID="48eda8cbb03cd3f29f8ba3b765e6b5351d02a3a6401cb258e378d9080cc708ac" Apr 16 16:52:02.723886 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.723864 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48eda8cbb03cd3f29f8ba3b765e6b5351d02a3a6401cb258e378d9080cc708ac"} err="failed to get container status \"48eda8cbb03cd3f29f8ba3b765e6b5351d02a3a6401cb258e378d9080cc708ac\": rpc error: code = NotFound desc = could not find container \"48eda8cbb03cd3f29f8ba3b765e6b5351d02a3a6401cb258e378d9080cc708ac\": container with ID starting with 48eda8cbb03cd3f29f8ba3b765e6b5351d02a3a6401cb258e378d9080cc708ac not found: ID does not exist" Apr 16 16:52:02.723944 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.723886 2568 scope.go:117] "RemoveContainer" containerID="ff4b94ff8fc063f55b1f18dbefa7bf9279b10a3fa186cf3d66a0e0820a78a490" Apr 16 16:52:02.724143 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.724124 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff4b94ff8fc063f55b1f18dbefa7bf9279b10a3fa186cf3d66a0e0820a78a490"} err="failed to get container status \"ff4b94ff8fc063f55b1f18dbefa7bf9279b10a3fa186cf3d66a0e0820a78a490\": rpc error: code = NotFound desc = could not find container \"ff4b94ff8fc063f55b1f18dbefa7bf9279b10a3fa186cf3d66a0e0820a78a490\": container with ID starting with ff4b94ff8fc063f55b1f18dbefa7bf9279b10a3fa186cf3d66a0e0820a78a490 not found: ID does not exist" Apr 16 16:52:02.724194 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.724145 2568 scope.go:117] "RemoveContainer" containerID="4e32e7b5262092bf4cd27be937db66d7ae3dc3a909c35a2e0600817a2800d817" Apr 16 16:52:02.724194 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.724179 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 16:52:02.724333 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.724314 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e32e7b5262092bf4cd27be937db66d7ae3dc3a909c35a2e0600817a2800d817"} err="failed to get container status \"4e32e7b5262092bf4cd27be937db66d7ae3dc3a909c35a2e0600817a2800d817\": rpc error: code = NotFound desc = could not find container \"4e32e7b5262092bf4cd27be937db66d7ae3dc3a909c35a2e0600817a2800d817\": container with ID starting with 4e32e7b5262092bf4cd27be937db66d7ae3dc3a909c35a2e0600817a2800d817 not found: ID does not exist" Apr 16 16:52:02.724412 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.724336 2568 scope.go:117] "RemoveContainer" containerID="6c2ecb432f00930a3145d4844c8d17ce901187265daa963520639b4271d344d4" Apr 16 16:52:02.724489 
ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.724471 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="12495802-0b96-4046-a489-89800a54f412" containerName="init-config-reloader" Apr 16 16:52:02.724549 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.724494 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="12495802-0b96-4046-a489-89800a54f412" containerName="init-config-reloader" Apr 16 16:52:02.724549 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.724507 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="12495802-0b96-4046-a489-89800a54f412" containerName="kube-rbac-proxy-thanos" Apr 16 16:52:02.724549 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.724529 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="12495802-0b96-4046-a489-89800a54f412" containerName="kube-rbac-proxy-thanos" Apr 16 16:52:02.724549 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.724543 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="12495802-0b96-4046-a489-89800a54f412" containerName="config-reloader" Apr 16 16:52:02.724549 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.724549 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="12495802-0b96-4046-a489-89800a54f412" containerName="config-reloader" Apr 16 16:52:02.724717 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.724555 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="12495802-0b96-4046-a489-89800a54f412" containerName="thanos-sidecar" Apr 16 16:52:02.724717 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.724562 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="12495802-0b96-4046-a489-89800a54f412" containerName="thanos-sidecar" Apr 16 16:52:02.724717 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.724580 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a06f9194-8eb8-465c-a248-cdeff2ea3ec9" containerName="registry" Apr 16 16:52:02.724717 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.724589 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="a06f9194-8eb8-465c-a248-cdeff2ea3ec9" containerName="registry" Apr 16 16:52:02.724717 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.724606 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="12495802-0b96-4046-a489-89800a54f412" containerName="prometheus" Apr 16 16:52:02.724717 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.724612 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="12495802-0b96-4046-a489-89800a54f412" containerName="prometheus" Apr 16 16:52:02.724717 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.724618 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="12495802-0b96-4046-a489-89800a54f412" containerName="kube-rbac-proxy-web" Apr 16 16:52:02.724717 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.724623 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="12495802-0b96-4046-a489-89800a54f412" containerName="kube-rbac-proxy-web" Apr 16 16:52:02.724717 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.724630 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="12495802-0b96-4046-a489-89800a54f412" containerName="kube-rbac-proxy" Apr 16 16:52:02.724717 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.724635 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="12495802-0b96-4046-a489-89800a54f412" 
containerName="kube-rbac-proxy" Apr 16 16:52:02.724717 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.724555 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c2ecb432f00930a3145d4844c8d17ce901187265daa963520639b4271d344d4"} err="failed to get container status \"6c2ecb432f00930a3145d4844c8d17ce901187265daa963520639b4271d344d4\": rpc error: code = NotFound desc = could not find container \"6c2ecb432f00930a3145d4844c8d17ce901187265daa963520639b4271d344d4\": container with ID starting with 6c2ecb432f00930a3145d4844c8d17ce901187265daa963520639b4271d344d4 not found: ID does not exist" Apr 16 16:52:02.724717 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.724655 2568 scope.go:117] "RemoveContainer" containerID="9fbaba2c174834cf237f54f94137d024b156c3e340b2487ecaedc14adf7f310d" Apr 16 16:52:02.724717 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.724682 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="12495802-0b96-4046-a489-89800a54f412" containerName="thanos-sidecar" Apr 16 16:52:02.724717 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.724694 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="a06f9194-8eb8-465c-a248-cdeff2ea3ec9" containerName="registry" Apr 16 16:52:02.724717 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.724703 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="12495802-0b96-4046-a489-89800a54f412" containerName="prometheus" Apr 16 16:52:02.724717 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.724709 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="12495802-0b96-4046-a489-89800a54f412" containerName="kube-rbac-proxy-web" Apr 16 16:52:02.724717 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.724716 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="12495802-0b96-4046-a489-89800a54f412" containerName="config-reloader" Apr 16 16:52:02.724717 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.724722 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="12495802-0b96-4046-a489-89800a54f412" containerName="kube-rbac-proxy" Apr 16 16:52:02.724717 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.724728 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="12495802-0b96-4046-a489-89800a54f412" containerName="kube-rbac-proxy-thanos" Apr 16 16:52:02.725358 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.724859 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9fbaba2c174834cf237f54f94137d024b156c3e340b2487ecaedc14adf7f310d"} err="failed to get container status \"9fbaba2c174834cf237f54f94137d024b156c3e340b2487ecaedc14adf7f310d\": rpc error: code = NotFound desc = could not find container \"9fbaba2c174834cf237f54f94137d024b156c3e340b2487ecaedc14adf7f310d\": container with ID starting with 9fbaba2c174834cf237f54f94137d024b156c3e340b2487ecaedc14adf7f310d not found: ID does not exist" Apr 16 16:52:02.725358 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.724876 2568 scope.go:117] "RemoveContainer" containerID="afd78565aec3cb88a5c7c2f2e97d374a854f6129590b35ca93020b5a727d046c" Apr 16 16:52:02.725358 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.725089 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"afd78565aec3cb88a5c7c2f2e97d374a854f6129590b35ca93020b5a727d046c"} err="failed to get container status \"afd78565aec3cb88a5c7c2f2e97d374a854f6129590b35ca93020b5a727d046c\": rpc 
error: code = NotFound desc = could not find container \"afd78565aec3cb88a5c7c2f2e97d374a854f6129590b35ca93020b5a727d046c\": container with ID starting with afd78565aec3cb88a5c7c2f2e97d374a854f6129590b35ca93020b5a727d046c not found: ID does not exist" Apr 16 16:52:02.725358 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.725104 2568 scope.go:117] "RemoveContainer" containerID="73afbdc518f846e4621a6fbae719ef3b9f416d3198e2b0b2d09d913c82bb091a" Apr 16 16:52:02.725358 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.725308 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73afbdc518f846e4621a6fbae719ef3b9f416d3198e2b0b2d09d913c82bb091a"} err="failed to get container status \"73afbdc518f846e4621a6fbae719ef3b9f416d3198e2b0b2d09d913c82bb091a\": rpc error: code = NotFound desc = could not find container \"73afbdc518f846e4621a6fbae719ef3b9f416d3198e2b0b2d09d913c82bb091a\": container with ID starting with 73afbdc518f846e4621a6fbae719ef3b9f416d3198e2b0b2d09d913c82bb091a not found: ID does not exist" Apr 16 16:52:02.725358 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.725321 2568 scope.go:117] "RemoveContainer" containerID="48eda8cbb03cd3f29f8ba3b765e6b5351d02a3a6401cb258e378d9080cc708ac" Apr 16 16:52:02.725614 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.725494 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48eda8cbb03cd3f29f8ba3b765e6b5351d02a3a6401cb258e378d9080cc708ac"} err="failed to get container status \"48eda8cbb03cd3f29f8ba3b765e6b5351d02a3a6401cb258e378d9080cc708ac\": rpc error: code = NotFound desc = could not find container \"48eda8cbb03cd3f29f8ba3b765e6b5351d02a3a6401cb258e378d9080cc708ac\": container with ID starting with 48eda8cbb03cd3f29f8ba3b765e6b5351d02a3a6401cb258e378d9080cc708ac not found: ID does not exist" Apr 16 16:52:02.725614 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.725510 2568 scope.go:117] "RemoveContainer" containerID="ff4b94ff8fc063f55b1f18dbefa7bf9279b10a3fa186cf3d66a0e0820a78a490" Apr 16 16:52:02.725788 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.725772 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff4b94ff8fc063f55b1f18dbefa7bf9279b10a3fa186cf3d66a0e0820a78a490"} err="failed to get container status \"ff4b94ff8fc063f55b1f18dbefa7bf9279b10a3fa186cf3d66a0e0820a78a490\": rpc error: code = NotFound desc = could not find container \"ff4b94ff8fc063f55b1f18dbefa7bf9279b10a3fa186cf3d66a0e0820a78a490\": container with ID starting with ff4b94ff8fc063f55b1f18dbefa7bf9279b10a3fa186cf3d66a0e0820a78a490 not found: ID does not exist" Apr 16 16:52:02.725833 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.725788 2568 scope.go:117] "RemoveContainer" containerID="4e32e7b5262092bf4cd27be937db66d7ae3dc3a909c35a2e0600817a2800d817" Apr 16 16:52:02.725983 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.725968 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e32e7b5262092bf4cd27be937db66d7ae3dc3a909c35a2e0600817a2800d817"} err="failed to get container status \"4e32e7b5262092bf4cd27be937db66d7ae3dc3a909c35a2e0600817a2800d817\": rpc error: code = NotFound desc = could not find container \"4e32e7b5262092bf4cd27be937db66d7ae3dc3a909c35a2e0600817a2800d817\": container with ID starting with 4e32e7b5262092bf4cd27be937db66d7ae3dc3a909c35a2e0600817a2800d817 not found: ID does not exist" Apr 16 16:52:02.726023 ip-10-0-131-63 kubenswrapper[2568]: I0416 
16:52:02.725983 2568 scope.go:117] "RemoveContainer" containerID="6c2ecb432f00930a3145d4844c8d17ce901187265daa963520639b4271d344d4" Apr 16 16:52:02.726157 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.726141 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c2ecb432f00930a3145d4844c8d17ce901187265daa963520639b4271d344d4"} err="failed to get container status \"6c2ecb432f00930a3145d4844c8d17ce901187265daa963520639b4271d344d4\": rpc error: code = NotFound desc = could not find container \"6c2ecb432f00930a3145d4844c8d17ce901187265daa963520639b4271d344d4\": container with ID starting with 6c2ecb432f00930a3145d4844c8d17ce901187265daa963520639b4271d344d4 not found: ID does not exist" Apr 16 16:52:02.726196 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.726156 2568 scope.go:117] "RemoveContainer" containerID="9fbaba2c174834cf237f54f94137d024b156c3e340b2487ecaedc14adf7f310d" Apr 16 16:52:02.726341 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.726326 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9fbaba2c174834cf237f54f94137d024b156c3e340b2487ecaedc14adf7f310d"} err="failed to get container status \"9fbaba2c174834cf237f54f94137d024b156c3e340b2487ecaedc14adf7f310d\": rpc error: code = NotFound desc = could not find container \"9fbaba2c174834cf237f54f94137d024b156c3e340b2487ecaedc14adf7f310d\": container with ID starting with 9fbaba2c174834cf237f54f94137d024b156c3e340b2487ecaedc14adf7f310d not found: ID does not exist" Apr 16 16:52:02.726392 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.726341 2568 scope.go:117] "RemoveContainer" containerID="afd78565aec3cb88a5c7c2f2e97d374a854f6129590b35ca93020b5a727d046c" Apr 16 16:52:02.726573 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.726553 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"afd78565aec3cb88a5c7c2f2e97d374a854f6129590b35ca93020b5a727d046c"} err="failed to get container status \"afd78565aec3cb88a5c7c2f2e97d374a854f6129590b35ca93020b5a727d046c\": rpc error: code = NotFound desc = could not find container \"afd78565aec3cb88a5c7c2f2e97d374a854f6129590b35ca93020b5a727d046c\": container with ID starting with afd78565aec3cb88a5c7c2f2e97d374a854f6129590b35ca93020b5a727d046c not found: ID does not exist" Apr 16 16:52:02.726631 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.726574 2568 scope.go:117] "RemoveContainer" containerID="73afbdc518f846e4621a6fbae719ef3b9f416d3198e2b0b2d09d913c82bb091a" Apr 16 16:52:02.726792 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.726774 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73afbdc518f846e4621a6fbae719ef3b9f416d3198e2b0b2d09d913c82bb091a"} err="failed to get container status \"73afbdc518f846e4621a6fbae719ef3b9f416d3198e2b0b2d09d913c82bb091a\": rpc error: code = NotFound desc = could not find container \"73afbdc518f846e4621a6fbae719ef3b9f416d3198e2b0b2d09d913c82bb091a\": container with ID starting with 73afbdc518f846e4621a6fbae719ef3b9f416d3198e2b0b2d09d913c82bb091a not found: ID does not exist" Apr 16 16:52:02.726833 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.726793 2568 scope.go:117] "RemoveContainer" containerID="48eda8cbb03cd3f29f8ba3b765e6b5351d02a3a6401cb258e378d9080cc708ac" Apr 16 16:52:02.726989 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.726971 2568 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"48eda8cbb03cd3f29f8ba3b765e6b5351d02a3a6401cb258e378d9080cc708ac"} err="failed to get container status \"48eda8cbb03cd3f29f8ba3b765e6b5351d02a3a6401cb258e378d9080cc708ac\": rpc error: code = NotFound desc = could not find container \"48eda8cbb03cd3f29f8ba3b765e6b5351d02a3a6401cb258e378d9080cc708ac\": container with ID starting with 48eda8cbb03cd3f29f8ba3b765e6b5351d02a3a6401cb258e378d9080cc708ac not found: ID does not exist" Apr 16 16:52:02.726989 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.726988 2568 scope.go:117] "RemoveContainer" containerID="ff4b94ff8fc063f55b1f18dbefa7bf9279b10a3fa186cf3d66a0e0820a78a490" Apr 16 16:52:02.727169 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.727152 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff4b94ff8fc063f55b1f18dbefa7bf9279b10a3fa186cf3d66a0e0820a78a490"} err="failed to get container status \"ff4b94ff8fc063f55b1f18dbefa7bf9279b10a3fa186cf3d66a0e0820a78a490\": rpc error: code = NotFound desc = could not find container \"ff4b94ff8fc063f55b1f18dbefa7bf9279b10a3fa186cf3d66a0e0820a78a490\": container with ID starting with ff4b94ff8fc063f55b1f18dbefa7bf9279b10a3fa186cf3d66a0e0820a78a490 not found: ID does not exist" Apr 16 16:52:02.727169 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.727170 2568 scope.go:117] "RemoveContainer" containerID="4e32e7b5262092bf4cd27be937db66d7ae3dc3a909c35a2e0600817a2800d817" Apr 16 16:52:02.727363 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.727345 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e32e7b5262092bf4cd27be937db66d7ae3dc3a909c35a2e0600817a2800d817"} err="failed to get container status \"4e32e7b5262092bf4cd27be937db66d7ae3dc3a909c35a2e0600817a2800d817\": rpc error: code = NotFound desc = could not find container \"4e32e7b5262092bf4cd27be937db66d7ae3dc3a909c35a2e0600817a2800d817\": container with ID starting with 4e32e7b5262092bf4cd27be937db66d7ae3dc3a909c35a2e0600817a2800d817 not found: ID does not exist" Apr 16 16:52:02.727408 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.727364 2568 scope.go:117] "RemoveContainer" containerID="6c2ecb432f00930a3145d4844c8d17ce901187265daa963520639b4271d344d4" Apr 16 16:52:02.727603 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.727586 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c2ecb432f00930a3145d4844c8d17ce901187265daa963520639b4271d344d4"} err="failed to get container status \"6c2ecb432f00930a3145d4844c8d17ce901187265daa963520639b4271d344d4\": rpc error: code = NotFound desc = could not find container \"6c2ecb432f00930a3145d4844c8d17ce901187265daa963520639b4271d344d4\": container with ID starting with 6c2ecb432f00930a3145d4844c8d17ce901187265daa963520639b4271d344d4 not found: ID does not exist" Apr 16 16:52:02.727650 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.727604 2568 scope.go:117] "RemoveContainer" containerID="9fbaba2c174834cf237f54f94137d024b156c3e340b2487ecaedc14adf7f310d" Apr 16 16:52:02.727791 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.727776 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9fbaba2c174834cf237f54f94137d024b156c3e340b2487ecaedc14adf7f310d"} err="failed to get container status \"9fbaba2c174834cf237f54f94137d024b156c3e340b2487ecaedc14adf7f310d\": rpc error: code = NotFound desc = could not find container 
\"9fbaba2c174834cf237f54f94137d024b156c3e340b2487ecaedc14adf7f310d\": container with ID starting with 9fbaba2c174834cf237f54f94137d024b156c3e340b2487ecaedc14adf7f310d not found: ID does not exist" Apr 16 16:52:02.727827 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.727791 2568 scope.go:117] "RemoveContainer" containerID="afd78565aec3cb88a5c7c2f2e97d374a854f6129590b35ca93020b5a727d046c" Apr 16 16:52:02.728012 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.727997 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"afd78565aec3cb88a5c7c2f2e97d374a854f6129590b35ca93020b5a727d046c"} err="failed to get container status \"afd78565aec3cb88a5c7c2f2e97d374a854f6129590b35ca93020b5a727d046c\": rpc error: code = NotFound desc = could not find container \"afd78565aec3cb88a5c7c2f2e97d374a854f6129590b35ca93020b5a727d046c\": container with ID starting with afd78565aec3cb88a5c7c2f2e97d374a854f6129590b35ca93020b5a727d046c not found: ID does not exist" Apr 16 16:52:02.728012 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.728011 2568 scope.go:117] "RemoveContainer" containerID="73afbdc518f846e4621a6fbae719ef3b9f416d3198e2b0b2d09d913c82bb091a" Apr 16 16:52:02.728206 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.728191 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73afbdc518f846e4621a6fbae719ef3b9f416d3198e2b0b2d09d913c82bb091a"} err="failed to get container status \"73afbdc518f846e4621a6fbae719ef3b9f416d3198e2b0b2d09d913c82bb091a\": rpc error: code = NotFound desc = could not find container \"73afbdc518f846e4621a6fbae719ef3b9f416d3198e2b0b2d09d913c82bb091a\": container with ID starting with 73afbdc518f846e4621a6fbae719ef3b9f416d3198e2b0b2d09d913c82bb091a not found: ID does not exist" Apr 16 16:52:02.728258 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.728206 2568 scope.go:117] "RemoveContainer" containerID="48eda8cbb03cd3f29f8ba3b765e6b5351d02a3a6401cb258e378d9080cc708ac" Apr 16 16:52:02.728391 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.728376 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48eda8cbb03cd3f29f8ba3b765e6b5351d02a3a6401cb258e378d9080cc708ac"} err="failed to get container status \"48eda8cbb03cd3f29f8ba3b765e6b5351d02a3a6401cb258e378d9080cc708ac\": rpc error: code = NotFound desc = could not find container \"48eda8cbb03cd3f29f8ba3b765e6b5351d02a3a6401cb258e378d9080cc708ac\": container with ID starting with 48eda8cbb03cd3f29f8ba3b765e6b5351d02a3a6401cb258e378d9080cc708ac not found: ID does not exist" Apr 16 16:52:02.728431 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.728390 2568 scope.go:117] "RemoveContainer" containerID="ff4b94ff8fc063f55b1f18dbefa7bf9279b10a3fa186cf3d66a0e0820a78a490" Apr 16 16:52:02.728580 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.728565 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:52:02.728627 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.728597 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff4b94ff8fc063f55b1f18dbefa7bf9279b10a3fa186cf3d66a0e0820a78a490"} err="failed to get container status \"ff4b94ff8fc063f55b1f18dbefa7bf9279b10a3fa186cf3d66a0e0820a78a490\": rpc error: code = NotFound desc = could not find container \"ff4b94ff8fc063f55b1f18dbefa7bf9279b10a3fa186cf3d66a0e0820a78a490\": container with ID starting with ff4b94ff8fc063f55b1f18dbefa7bf9279b10a3fa186cf3d66a0e0820a78a490 not found: ID does not exist" Apr 16 16:52:02.728627 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.728609 2568 scope.go:117] "RemoveContainer" containerID="4e32e7b5262092bf4cd27be937db66d7ae3dc3a909c35a2e0600817a2800d817" Apr 16 16:52:02.728807 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.728781 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e32e7b5262092bf4cd27be937db66d7ae3dc3a909c35a2e0600817a2800d817"} err="failed to get container status \"4e32e7b5262092bf4cd27be937db66d7ae3dc3a909c35a2e0600817a2800d817\": rpc error: code = NotFound desc = could not find container \"4e32e7b5262092bf4cd27be937db66d7ae3dc3a909c35a2e0600817a2800d817\": container with ID starting with 4e32e7b5262092bf4cd27be937db66d7ae3dc3a909c35a2e0600817a2800d817 not found: ID does not exist" Apr 16 16:52:02.728869 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.728808 2568 scope.go:117] "RemoveContainer" containerID="6c2ecb432f00930a3145d4844c8d17ce901187265daa963520639b4271d344d4" Apr 16 16:52:02.729048 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.729023 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c2ecb432f00930a3145d4844c8d17ce901187265daa963520639b4271d344d4"} err="failed to get container status \"6c2ecb432f00930a3145d4844c8d17ce901187265daa963520639b4271d344d4\": rpc error: code = NotFound desc = could not find container \"6c2ecb432f00930a3145d4844c8d17ce901187265daa963520639b4271d344d4\": container with ID starting with 6c2ecb432f00930a3145d4844c8d17ce901187265daa963520639b4271d344d4 not found: ID does not exist" Apr 16 16:52:02.729048 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.729047 2568 scope.go:117] "RemoveContainer" containerID="9fbaba2c174834cf237f54f94137d024b156c3e340b2487ecaedc14adf7f310d" Apr 16 16:52:02.729289 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.729267 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9fbaba2c174834cf237f54f94137d024b156c3e340b2487ecaedc14adf7f310d"} err="failed to get container status \"9fbaba2c174834cf237f54f94137d024b156c3e340b2487ecaedc14adf7f310d\": rpc error: code = NotFound desc = could not find container \"9fbaba2c174834cf237f54f94137d024b156c3e340b2487ecaedc14adf7f310d\": container with ID starting with 9fbaba2c174834cf237f54f94137d024b156c3e340b2487ecaedc14adf7f310d not found: ID does not exist" Apr 16 16:52:02.729338 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.729291 2568 scope.go:117] "RemoveContainer" containerID="afd78565aec3cb88a5c7c2f2e97d374a854f6129590b35ca93020b5a727d046c" Apr 16 16:52:02.729558 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.729512 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"afd78565aec3cb88a5c7c2f2e97d374a854f6129590b35ca93020b5a727d046c"} 
err="failed to get container status \"afd78565aec3cb88a5c7c2f2e97d374a854f6129590b35ca93020b5a727d046c\": rpc error: code = NotFound desc = could not find container \"afd78565aec3cb88a5c7c2f2e97d374a854f6129590b35ca93020b5a727d046c\": container with ID starting with afd78565aec3cb88a5c7c2f2e97d374a854f6129590b35ca93020b5a727d046c not found: ID does not exist" Apr 16 16:52:02.729652 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.729558 2568 scope.go:117] "RemoveContainer" containerID="73afbdc518f846e4621a6fbae719ef3b9f416d3198e2b0b2d09d913c82bb091a" Apr 16 16:52:02.729791 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.729773 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73afbdc518f846e4621a6fbae719ef3b9f416d3198e2b0b2d09d913c82bb091a"} err="failed to get container status \"73afbdc518f846e4621a6fbae719ef3b9f416d3198e2b0b2d09d913c82bb091a\": rpc error: code = NotFound desc = could not find container \"73afbdc518f846e4621a6fbae719ef3b9f416d3198e2b0b2d09d913c82bb091a\": container with ID starting with 73afbdc518f846e4621a6fbae719ef3b9f416d3198e2b0b2d09d913c82bb091a not found: ID does not exist" Apr 16 16:52:02.729863 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.729793 2568 scope.go:117] "RemoveContainer" containerID="48eda8cbb03cd3f29f8ba3b765e6b5351d02a3a6401cb258e378d9080cc708ac" Apr 16 16:52:02.730023 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.730006 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48eda8cbb03cd3f29f8ba3b765e6b5351d02a3a6401cb258e378d9080cc708ac"} err="failed to get container status \"48eda8cbb03cd3f29f8ba3b765e6b5351d02a3a6401cb258e378d9080cc708ac\": rpc error: code = NotFound desc = could not find container \"48eda8cbb03cd3f29f8ba3b765e6b5351d02a3a6401cb258e378d9080cc708ac\": container with ID starting with 48eda8cbb03cd3f29f8ba3b765e6b5351d02a3a6401cb258e378d9080cc708ac not found: ID does not exist" Apr 16 16:52:02.730068 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.730024 2568 scope.go:117] "RemoveContainer" containerID="ff4b94ff8fc063f55b1f18dbefa7bf9279b10a3fa186cf3d66a0e0820a78a490" Apr 16 16:52:02.730226 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.730207 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff4b94ff8fc063f55b1f18dbefa7bf9279b10a3fa186cf3d66a0e0820a78a490"} err="failed to get container status \"ff4b94ff8fc063f55b1f18dbefa7bf9279b10a3fa186cf3d66a0e0820a78a490\": rpc error: code = NotFound desc = could not find container \"ff4b94ff8fc063f55b1f18dbefa7bf9279b10a3fa186cf3d66a0e0820a78a490\": container with ID starting with ff4b94ff8fc063f55b1f18dbefa7bf9279b10a3fa186cf3d66a0e0820a78a490 not found: ID does not exist" Apr 16 16:52:02.731745 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.731730 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 16 16:52:02.732050 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.732037 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 16 16:52:02.732315 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.732288 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 16 16:52:02.732406 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.732337 2568 reflector.go:430] "Caches populated" 
type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-vd5ts\"" Apr 16 16:52:02.732406 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.732380 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 16 16:52:02.732406 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.732396 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 16 16:52:02.732587 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.732562 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 16 16:52:02.732587 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.732568 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 16 16:52:02.732698 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.732586 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 16 16:52:02.732698 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.732564 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 16 16:52:02.732957 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.732939 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 16 16:52:02.733025 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.732993 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-3eh9lmjtm83rd\"" Apr 16 16:52:02.733025 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.733004 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 16 16:52:02.735217 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.735199 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 16 16:52:02.738856 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.738837 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 16 16:52:02.744370 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.744341 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 16:52:02.810786 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.810695 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c30842ca-2edc-4775-afe7-8ca58bd3f029-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c30842ca-2edc-4775-afe7-8ca58bd3f029\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:52:02.810786 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.810735 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/c30842ca-2edc-4775-afe7-8ca58bd3f029-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"c30842ca-2edc-4775-afe7-8ca58bd3f029\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:52:02.810786 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.810759 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/c30842ca-2edc-4775-afe7-8ca58bd3f029-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"c30842ca-2edc-4775-afe7-8ca58bd3f029\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:52:02.810995 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.810813 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/c30842ca-2edc-4775-afe7-8ca58bd3f029-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"c30842ca-2edc-4775-afe7-8ca58bd3f029\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:52:02.810995 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.810836 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c30842ca-2edc-4775-afe7-8ca58bd3f029-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"c30842ca-2edc-4775-afe7-8ca58bd3f029\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:52:02.810995 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.810904 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhs7j\" (UniqueName: \"kubernetes.io/projected/c30842ca-2edc-4775-afe7-8ca58bd3f029-kube-api-access-qhs7j\") pod \"prometheus-k8s-0\" (UID: \"c30842ca-2edc-4775-afe7-8ca58bd3f029\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:52:02.810995 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.810932 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c30842ca-2edc-4775-afe7-8ca58bd3f029-web-config\") pod \"prometheus-k8s-0\" (UID: \"c30842ca-2edc-4775-afe7-8ca58bd3f029\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:52:02.810995 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.810955 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c30842ca-2edc-4775-afe7-8ca58bd3f029-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c30842ca-2edc-4775-afe7-8ca58bd3f029\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:52:02.810995 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.810977 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c30842ca-2edc-4775-afe7-8ca58bd3f029-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"c30842ca-2edc-4775-afe7-8ca58bd3f029\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:52:02.810995 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.810993 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/c30842ca-2edc-4775-afe7-8ca58bd3f029-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"c30842ca-2edc-4775-afe7-8ca58bd3f029\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:52:02.811196 ip-10-0-131-63 kubenswrapper[2568]: 
I0416 16:52:02.811012 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c30842ca-2edc-4775-afe7-8ca58bd3f029-config-out\") pod \"prometheus-k8s-0\" (UID: \"c30842ca-2edc-4775-afe7-8ca58bd3f029\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:52:02.811196 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.811042 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c30842ca-2edc-4775-afe7-8ca58bd3f029-config\") pod \"prometheus-k8s-0\" (UID: \"c30842ca-2edc-4775-afe7-8ca58bd3f029\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:52:02.811196 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.811064 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/c30842ca-2edc-4775-afe7-8ca58bd3f029-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"c30842ca-2edc-4775-afe7-8ca58bd3f029\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:52:02.811196 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.811092 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c30842ca-2edc-4775-afe7-8ca58bd3f029-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c30842ca-2edc-4775-afe7-8ca58bd3f029\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:52:02.811196 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.811122 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c30842ca-2edc-4775-afe7-8ca58bd3f029-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"c30842ca-2edc-4775-afe7-8ca58bd3f029\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:52:02.811196 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.811143 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/c30842ca-2edc-4775-afe7-8ca58bd3f029-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"c30842ca-2edc-4775-afe7-8ca58bd3f029\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:52:02.811196 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.811165 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c30842ca-2edc-4775-afe7-8ca58bd3f029-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"c30842ca-2edc-4775-afe7-8ca58bd3f029\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:52:02.811196 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.811185 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/c30842ca-2edc-4775-afe7-8ca58bd3f029-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"c30842ca-2edc-4775-afe7-8ca58bd3f029\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:52:02.911996 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.911949 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/c30842ca-2edc-4775-afe7-8ca58bd3f029-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"c30842ca-2edc-4775-afe7-8ca58bd3f029\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:52:02.911996 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.911999 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/c30842ca-2edc-4775-afe7-8ca58bd3f029-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"c30842ca-2edc-4775-afe7-8ca58bd3f029\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:52:02.912222 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.912028 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c30842ca-2edc-4775-afe7-8ca58bd3f029-config-out\") pod \"prometheus-k8s-0\" (UID: \"c30842ca-2edc-4775-afe7-8ca58bd3f029\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:52:02.912222 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.912057 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c30842ca-2edc-4775-afe7-8ca58bd3f029-config\") pod \"prometheus-k8s-0\" (UID: \"c30842ca-2edc-4775-afe7-8ca58bd3f029\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:52:02.912222 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.912080 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/c30842ca-2edc-4775-afe7-8ca58bd3f029-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"c30842ca-2edc-4775-afe7-8ca58bd3f029\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:52:02.912222 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.912106 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c30842ca-2edc-4775-afe7-8ca58bd3f029-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c30842ca-2edc-4775-afe7-8ca58bd3f029\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:52:02.912222 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.912134 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c30842ca-2edc-4775-afe7-8ca58bd3f029-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"c30842ca-2edc-4775-afe7-8ca58bd3f029\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:52:02.912222 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.912163 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/c30842ca-2edc-4775-afe7-8ca58bd3f029-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"c30842ca-2edc-4775-afe7-8ca58bd3f029\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:52:02.912222 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.912188 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c30842ca-2edc-4775-afe7-8ca58bd3f029-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"c30842ca-2edc-4775-afe7-8ca58bd3f029\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:52:02.912570 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.912226 2568 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/c30842ca-2edc-4775-afe7-8ca58bd3f029-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"c30842ca-2edc-4775-afe7-8ca58bd3f029\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:52:02.912570 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.912280 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c30842ca-2edc-4775-afe7-8ca58bd3f029-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c30842ca-2edc-4775-afe7-8ca58bd3f029\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:52:02.912570 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.912305 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/c30842ca-2edc-4775-afe7-8ca58bd3f029-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"c30842ca-2edc-4775-afe7-8ca58bd3f029\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:52:02.912570 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.912330 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/c30842ca-2edc-4775-afe7-8ca58bd3f029-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"c30842ca-2edc-4775-afe7-8ca58bd3f029\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:52:02.912570 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.912367 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/c30842ca-2edc-4775-afe7-8ca58bd3f029-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"c30842ca-2edc-4775-afe7-8ca58bd3f029\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:52:02.912570 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.912397 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c30842ca-2edc-4775-afe7-8ca58bd3f029-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"c30842ca-2edc-4775-afe7-8ca58bd3f029\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:52:02.912570 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.912428 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qhs7j\" (UniqueName: \"kubernetes.io/projected/c30842ca-2edc-4775-afe7-8ca58bd3f029-kube-api-access-qhs7j\") pod \"prometheus-k8s-0\" (UID: \"c30842ca-2edc-4775-afe7-8ca58bd3f029\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:52:02.912570 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.912453 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c30842ca-2edc-4775-afe7-8ca58bd3f029-web-config\") pod \"prometheus-k8s-0\" (UID: \"c30842ca-2edc-4775-afe7-8ca58bd3f029\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:52:02.912570 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.912482 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c30842ca-2edc-4775-afe7-8ca58bd3f029-configmap-kubelet-serving-ca-bundle\") 
pod \"prometheus-k8s-0\" (UID: \"c30842ca-2edc-4775-afe7-8ca58bd3f029\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:52:02.912570 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.912568 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/c30842ca-2edc-4775-afe7-8ca58bd3f029-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"c30842ca-2edc-4775-afe7-8ca58bd3f029\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:52:02.913208 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.913181 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c30842ca-2edc-4775-afe7-8ca58bd3f029-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c30842ca-2edc-4775-afe7-8ca58bd3f029\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:52:02.913276 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.913206 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c30842ca-2edc-4775-afe7-8ca58bd3f029-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c30842ca-2edc-4775-afe7-8ca58bd3f029\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:52:02.913354 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.913324 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c30842ca-2edc-4775-afe7-8ca58bd3f029-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"c30842ca-2edc-4775-afe7-8ca58bd3f029\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:52:02.915050 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.915018 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c30842ca-2edc-4775-afe7-8ca58bd3f029-config-out\") pod \"prometheus-k8s-0\" (UID: \"c30842ca-2edc-4775-afe7-8ca58bd3f029\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:52:02.915498 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.915342 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c30842ca-2edc-4775-afe7-8ca58bd3f029-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"c30842ca-2edc-4775-afe7-8ca58bd3f029\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:52:02.915498 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.915385 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/c30842ca-2edc-4775-afe7-8ca58bd3f029-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"c30842ca-2edc-4775-afe7-8ca58bd3f029\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:52:02.915498 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.915421 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/c30842ca-2edc-4775-afe7-8ca58bd3f029-config\") pod \"prometheus-k8s-0\" (UID: \"c30842ca-2edc-4775-afe7-8ca58bd3f029\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:52:02.916183 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.915817 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: 
\"kubernetes.io/secret/c30842ca-2edc-4775-afe7-8ca58bd3f029-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"c30842ca-2edc-4775-afe7-8ca58bd3f029\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:52:02.916183 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.915916 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c30842ca-2edc-4775-afe7-8ca58bd3f029-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"c30842ca-2edc-4775-afe7-8ca58bd3f029\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:52:02.916183 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.916099 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c30842ca-2edc-4775-afe7-8ca58bd3f029-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c30842ca-2edc-4775-afe7-8ca58bd3f029\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:52:02.916183 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.916137 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c30842ca-2edc-4775-afe7-8ca58bd3f029-web-config\") pod \"prometheus-k8s-0\" (UID: \"c30842ca-2edc-4775-afe7-8ca58bd3f029\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:52:02.916183 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.916144 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/c30842ca-2edc-4775-afe7-8ca58bd3f029-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"c30842ca-2edc-4775-afe7-8ca58bd3f029\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:52:02.917252 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.917232 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/c30842ca-2edc-4775-afe7-8ca58bd3f029-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"c30842ca-2edc-4775-afe7-8ca58bd3f029\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:52:02.917572 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.917548 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/c30842ca-2edc-4775-afe7-8ca58bd3f029-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"c30842ca-2edc-4775-afe7-8ca58bd3f029\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:52:02.917808 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.917792 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/c30842ca-2edc-4775-afe7-8ca58bd3f029-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"c30842ca-2edc-4775-afe7-8ca58bd3f029\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:52:02.918362 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.918347 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/c30842ca-2edc-4775-afe7-8ca58bd3f029-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"c30842ca-2edc-4775-afe7-8ca58bd3f029\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:52:02.921192 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:02.921172 2568 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhs7j\" (UniqueName: \"kubernetes.io/projected/c30842ca-2edc-4775-afe7-8ca58bd3f029-kube-api-access-qhs7j\") pod \"prometheus-k8s-0\" (UID: \"c30842ca-2edc-4775-afe7-8ca58bd3f029\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:52:03.039002 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:03.038955 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:52:03.162361 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:03.162330 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 16:52:03.166952 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:52:03.166882 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc30842ca_2edc_4775_afe7_8ca58bd3f029.slice/crio-bcba01f23ed75be87156bc465e5676b2bc9ba23a79e42a397c888f0f2b39b016 WatchSource:0}: Error finding container bcba01f23ed75be87156bc465e5676b2bc9ba23a79e42a397c888f0f2b39b016: Status 404 returned error can't find the container with id bcba01f23ed75be87156bc465e5676b2bc9ba23a79e42a397c888f0f2b39b016 Apr 16 16:52:03.674377 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:03.674340 2568 generic.go:358] "Generic (PLEG): container finished" podID="c30842ca-2edc-4775-afe7-8ca58bd3f029" containerID="06812055502ff74b094f480111c919e9a4621b314f9410d2177c37294a8088dd" exitCode=0 Apr 16 16:52:03.674565 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:03.674395 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c30842ca-2edc-4775-afe7-8ca58bd3f029","Type":"ContainerDied","Data":"06812055502ff74b094f480111c919e9a4621b314f9410d2177c37294a8088dd"} Apr 16 16:52:03.674565 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:03.674422 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c30842ca-2edc-4775-afe7-8ca58bd3f029","Type":"ContainerStarted","Data":"bcba01f23ed75be87156bc465e5676b2bc9ba23a79e42a397c888f0f2b39b016"} Apr 16 16:52:03.935039 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:03.935005 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12495802-0b96-4046-a489-89800a54f412" path="/var/lib/kubelet/pods/12495802-0b96-4046-a489-89800a54f412/volumes" Apr 16 16:52:04.680534 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:04.680475 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c30842ca-2edc-4775-afe7-8ca58bd3f029","Type":"ContainerStarted","Data":"6726a1d95f5d2a45c96b54e871317cbb7dbf92a7b6ff92218cdba99fbc467eac"} Apr 16 16:52:04.680534 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:04.680511 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c30842ca-2edc-4775-afe7-8ca58bd3f029","Type":"ContainerStarted","Data":"0a69010f21b657309667a0e3a1dca4b3a977ccd4ab92758ccf3c48f442e73310"} Apr 16 16:52:04.681065 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:04.680548 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c30842ca-2edc-4775-afe7-8ca58bd3f029","Type":"ContainerStarted","Data":"2ab57eba75e2bca4ab89a2a32eda2fd472f567d5165544d6949f28c3f89591a3"} Apr 16 16:52:04.681065 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:04.680559 2568 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c30842ca-2edc-4775-afe7-8ca58bd3f029","Type":"ContainerStarted","Data":"a87cdccc0a26682f2a36ab718d2ae608d93e49e40e0af5a6e51e2c2999f5a6c7"} Apr 16 16:52:04.681065 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:04.680567 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c30842ca-2edc-4775-afe7-8ca58bd3f029","Type":"ContainerStarted","Data":"fe4fcd523376b4c116075066da48465568ebef5765fa6c1b2b37e72d02612971"} Apr 16 16:52:04.681065 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:04.680576 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c30842ca-2edc-4775-afe7-8ca58bd3f029","Type":"ContainerStarted","Data":"646c25847ce56bc007cccd748f3437b2a787f69b1fe99ec7ec7a11cb19143e4a"} Apr 16 16:52:04.720585 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:04.720507 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.720488203 podStartE2EDuration="2.720488203s" podCreationTimestamp="2026-04-16 16:52:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:52:04.717937158 +0000 UTC m=+261.388114001" watchObservedRunningTime="2026-04-16 16:52:04.720488203 +0000 UTC m=+261.390664983" Apr 16 16:52:08.039256 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:08.039221 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:52:43.834862 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:52:43.834839 2568 kubelet.go:1628] "Image garbage collection succeeded" Apr 16 16:53:03.039994 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:53:03.039956 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:53:03.055090 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:53:03.055064 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:53:03.852207 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:53:03.852180 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:55:55.494695 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:55:55.494660 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-6986579898-z4zlk"] Apr 16 16:55:55.497767 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:55:55.497748 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-6986579898-z4zlk" Apr 16 16:55:55.500652 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:55:55.500627 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-controller-manager-dockercfg-twzw2\"" Apr 16 16:55:55.501890 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:55:55.501845 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 16 16:55:55.502023 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:55:55.501898 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 16 16:55:55.502084 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:55:55.502023 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-webhook-server-cert\"" Apr 16 16:55:55.508635 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:55:55.508611 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-6986579898-z4zlk"] Apr 16 16:55:55.557371 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:55:55.557337 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6dgw\" (UniqueName: \"kubernetes.io/projected/996ee467-7cdf-4397-bb8c-fb94c379d445-kube-api-access-t6dgw\") pod \"kserve-controller-manager-6986579898-z4zlk\" (UID: \"996ee467-7cdf-4397-bb8c-fb94c379d445\") " pod="kserve/kserve-controller-manager-6986579898-z4zlk" Apr 16 16:55:55.557565 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:55:55.557452 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/996ee467-7cdf-4397-bb8c-fb94c379d445-cert\") pod \"kserve-controller-manager-6986579898-z4zlk\" (UID: \"996ee467-7cdf-4397-bb8c-fb94c379d445\") " pod="kserve/kserve-controller-manager-6986579898-z4zlk" Apr 16 16:55:55.658318 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:55:55.658277 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/996ee467-7cdf-4397-bb8c-fb94c379d445-cert\") pod \"kserve-controller-manager-6986579898-z4zlk\" (UID: \"996ee467-7cdf-4397-bb8c-fb94c379d445\") " pod="kserve/kserve-controller-manager-6986579898-z4zlk" Apr 16 16:55:55.658513 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:55:55.658335 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t6dgw\" (UniqueName: \"kubernetes.io/projected/996ee467-7cdf-4397-bb8c-fb94c379d445-kube-api-access-t6dgw\") pod \"kserve-controller-manager-6986579898-z4zlk\" (UID: \"996ee467-7cdf-4397-bb8c-fb94c379d445\") " pod="kserve/kserve-controller-manager-6986579898-z4zlk" Apr 16 16:55:55.660748 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:55:55.660721 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/996ee467-7cdf-4397-bb8c-fb94c379d445-cert\") pod \"kserve-controller-manager-6986579898-z4zlk\" (UID: \"996ee467-7cdf-4397-bb8c-fb94c379d445\") " pod="kserve/kserve-controller-manager-6986579898-z4zlk" Apr 16 16:55:55.667535 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:55:55.667488 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6dgw\" (UniqueName: \"kubernetes.io/projected/996ee467-7cdf-4397-bb8c-fb94c379d445-kube-api-access-t6dgw\") pod 
\"kserve-controller-manager-6986579898-z4zlk\" (UID: \"996ee467-7cdf-4397-bb8c-fb94c379d445\") " pod="kserve/kserve-controller-manager-6986579898-z4zlk" Apr 16 16:55:55.809197 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:55:55.809158 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-6986579898-z4zlk" Apr 16 16:55:55.924123 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:55:55.924052 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-6986579898-z4zlk"] Apr 16 16:55:55.926822 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:55:55.926795 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod996ee467_7cdf_4397_bb8c_fb94c379d445.slice/crio-fea244c49e57724dd5c09dbf7d644d209178255a20b0d8ca7a325948779391df WatchSource:0}: Error finding container fea244c49e57724dd5c09dbf7d644d209178255a20b0d8ca7a325948779391df: Status 404 returned error can't find the container with id fea244c49e57724dd5c09dbf7d644d209178255a20b0d8ca7a325948779391df Apr 16 16:55:55.928156 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:55:55.928134 2568 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 16:55:56.290548 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:55:56.290499 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-6986579898-z4zlk" event={"ID":"996ee467-7cdf-4397-bb8c-fb94c379d445","Type":"ContainerStarted","Data":"fea244c49e57724dd5c09dbf7d644d209178255a20b0d8ca7a325948779391df"} Apr 16 16:55:59.303034 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:55:59.303000 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-6986579898-z4zlk" event={"ID":"996ee467-7cdf-4397-bb8c-fb94c379d445","Type":"ContainerStarted","Data":"46518af34031744f2fb367bb563a38deeb193dead98e307c2af754994a57043c"} Apr 16 16:55:59.303426 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:55:59.303114 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-6986579898-z4zlk" Apr 16 16:55:59.319844 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:55:59.319797 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-6986579898-z4zlk" podStartSLOduration=1.587183134 podStartE2EDuration="4.319783448s" podCreationTimestamp="2026-04-16 16:55:55 +0000 UTC" firstStartedPulling="2026-04-16 16:55:55.928328629 +0000 UTC m=+492.598505407" lastFinishedPulling="2026-04-16 16:55:58.660928964 +0000 UTC m=+495.331105721" observedRunningTime="2026-04-16 16:55:59.318081854 +0000 UTC m=+495.988258632" watchObservedRunningTime="2026-04-16 16:55:59.319783448 +0000 UTC m=+495.989960226" Apr 16 16:56:30.310489 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:56:30.310459 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-6986579898-z4zlk" Apr 16 16:56:31.125572 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:56:31.125533 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-6986579898-z4zlk"] Apr 16 16:56:31.125836 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:56:31.125793 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/kserve-controller-manager-6986579898-z4zlk" podUID="996ee467-7cdf-4397-bb8c-fb94c379d445" containerName="manager" 
containerID="cri-o://46518af34031744f2fb367bb563a38deeb193dead98e307c2af754994a57043c" gracePeriod=10 Apr 16 16:56:31.152445 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:56:31.152415 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-6986579898-v9pgs"] Apr 16 16:56:31.155233 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:56:31.155217 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-6986579898-v9pgs" Apr 16 16:56:31.164309 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:56:31.164280 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-6986579898-v9pgs"] Apr 16 16:56:31.239952 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:56:31.239918 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/706af7ed-b512-4c1e-852d-9cd18155209d-cert\") pod \"kserve-controller-manager-6986579898-v9pgs\" (UID: \"706af7ed-b512-4c1e-852d-9cd18155209d\") " pod="kserve/kserve-controller-manager-6986579898-v9pgs" Apr 16 16:56:31.240110 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:56:31.239965 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6knxw\" (UniqueName: \"kubernetes.io/projected/706af7ed-b512-4c1e-852d-9cd18155209d-kube-api-access-6knxw\") pod \"kserve-controller-manager-6986579898-v9pgs\" (UID: \"706af7ed-b512-4c1e-852d-9cd18155209d\") " pod="kserve/kserve-controller-manager-6986579898-v9pgs" Apr 16 16:56:31.341059 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:56:31.341025 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/706af7ed-b512-4c1e-852d-9cd18155209d-cert\") pod \"kserve-controller-manager-6986579898-v9pgs\" (UID: \"706af7ed-b512-4c1e-852d-9cd18155209d\") " pod="kserve/kserve-controller-manager-6986579898-v9pgs" Apr 16 16:56:31.341504 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:56:31.341070 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6knxw\" (UniqueName: \"kubernetes.io/projected/706af7ed-b512-4c1e-852d-9cd18155209d-kube-api-access-6knxw\") pod \"kserve-controller-manager-6986579898-v9pgs\" (UID: \"706af7ed-b512-4c1e-852d-9cd18155209d\") " pod="kserve/kserve-controller-manager-6986579898-v9pgs" Apr 16 16:56:31.343629 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:56:31.343606 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/706af7ed-b512-4c1e-852d-9cd18155209d-cert\") pod \"kserve-controller-manager-6986579898-v9pgs\" (UID: \"706af7ed-b512-4c1e-852d-9cd18155209d\") " pod="kserve/kserve-controller-manager-6986579898-v9pgs" Apr 16 16:56:31.350133 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:56:31.350108 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6knxw\" (UniqueName: \"kubernetes.io/projected/706af7ed-b512-4c1e-852d-9cd18155209d-kube-api-access-6knxw\") pod \"kserve-controller-manager-6986579898-v9pgs\" (UID: \"706af7ed-b512-4c1e-852d-9cd18155209d\") " pod="kserve/kserve-controller-manager-6986579898-v9pgs" Apr 16 16:56:31.361914 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:56:31.361895 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-6986579898-z4zlk" Apr 16 16:56:31.391348 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:56:31.391257 2568 generic.go:358] "Generic (PLEG): container finished" podID="996ee467-7cdf-4397-bb8c-fb94c379d445" containerID="46518af34031744f2fb367bb563a38deeb193dead98e307c2af754994a57043c" exitCode=0 Apr 16 16:56:31.391348 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:56:31.391316 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-6986579898-z4zlk" event={"ID":"996ee467-7cdf-4397-bb8c-fb94c379d445","Type":"ContainerDied","Data":"46518af34031744f2fb367bb563a38deeb193dead98e307c2af754994a57043c"} Apr 16 16:56:31.391348 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:56:31.391335 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-6986579898-z4zlk" Apr 16 16:56:31.391647 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:56:31.391354 2568 scope.go:117] "RemoveContainer" containerID="46518af34031744f2fb367bb563a38deeb193dead98e307c2af754994a57043c" Apr 16 16:56:31.391647 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:56:31.391344 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-6986579898-z4zlk" event={"ID":"996ee467-7cdf-4397-bb8c-fb94c379d445","Type":"ContainerDied","Data":"fea244c49e57724dd5c09dbf7d644d209178255a20b0d8ca7a325948779391df"} Apr 16 16:56:31.399699 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:56:31.399668 2568 scope.go:117] "RemoveContainer" containerID="46518af34031744f2fb367bb563a38deeb193dead98e307c2af754994a57043c" Apr 16 16:56:31.400070 ip-10-0-131-63 kubenswrapper[2568]: E0416 16:56:31.400034 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"46518af34031744f2fb367bb563a38deeb193dead98e307c2af754994a57043c\": container with ID starting with 46518af34031744f2fb367bb563a38deeb193dead98e307c2af754994a57043c not found: ID does not exist" containerID="46518af34031744f2fb367bb563a38deeb193dead98e307c2af754994a57043c" Apr 16 16:56:31.400177 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:56:31.400082 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46518af34031744f2fb367bb563a38deeb193dead98e307c2af754994a57043c"} err="failed to get container status \"46518af34031744f2fb367bb563a38deeb193dead98e307c2af754994a57043c\": rpc error: code = NotFound desc = could not find container \"46518af34031744f2fb367bb563a38deeb193dead98e307c2af754994a57043c\": container with ID starting with 46518af34031744f2fb367bb563a38deeb193dead98e307c2af754994a57043c not found: ID does not exist" Apr 16 16:56:31.442358 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:56:31.442327 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t6dgw\" (UniqueName: \"kubernetes.io/projected/996ee467-7cdf-4397-bb8c-fb94c379d445-kube-api-access-t6dgw\") pod \"996ee467-7cdf-4397-bb8c-fb94c379d445\" (UID: \"996ee467-7cdf-4397-bb8c-fb94c379d445\") " Apr 16 16:56:31.442512 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:56:31.442412 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/996ee467-7cdf-4397-bb8c-fb94c379d445-cert\") pod \"996ee467-7cdf-4397-bb8c-fb94c379d445\" (UID: \"996ee467-7cdf-4397-bb8c-fb94c379d445\") " Apr 16 16:56:31.444483 ip-10-0-131-63 kubenswrapper[2568]: I0416 
16:56:31.444457 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/996ee467-7cdf-4397-bb8c-fb94c379d445-kube-api-access-t6dgw" (OuterVolumeSpecName: "kube-api-access-t6dgw") pod "996ee467-7cdf-4397-bb8c-fb94c379d445" (UID: "996ee467-7cdf-4397-bb8c-fb94c379d445"). InnerVolumeSpecName "kube-api-access-t6dgw". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 16:56:31.444573 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:56:31.444477 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/996ee467-7cdf-4397-bb8c-fb94c379d445-cert" (OuterVolumeSpecName: "cert") pod "996ee467-7cdf-4397-bb8c-fb94c379d445" (UID: "996ee467-7cdf-4397-bb8c-fb94c379d445"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:56:31.502771 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:56:31.502727 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-6986579898-v9pgs" Apr 16 16:56:31.543783 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:56:31.543734 2568 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/996ee467-7cdf-4397-bb8c-fb94c379d445-cert\") on node \"ip-10-0-131-63.ec2.internal\" DevicePath \"\"" Apr 16 16:56:31.543783 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:56:31.543779 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-t6dgw\" (UniqueName: \"kubernetes.io/projected/996ee467-7cdf-4397-bb8c-fb94c379d445-kube-api-access-t6dgw\") on node \"ip-10-0-131-63.ec2.internal\" DevicePath \"\"" Apr 16 16:56:31.620785 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:56:31.620763 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-6986579898-v9pgs"] Apr 16 16:56:31.623433 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:56:31.623405 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod706af7ed_b512_4c1e_852d_9cd18155209d.slice/crio-bb7cb3eefe907dcb7d27328a7ac93a73d10b1c537f5780ff46f8852e6f3c037d WatchSource:0}: Error finding container bb7cb3eefe907dcb7d27328a7ac93a73d10b1c537f5780ff46f8852e6f3c037d: Status 404 returned error can't find the container with id bb7cb3eefe907dcb7d27328a7ac93a73d10b1c537f5780ff46f8852e6f3c037d Apr 16 16:56:31.711860 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:56:31.711829 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-6986579898-z4zlk"] Apr 16 16:56:31.714095 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:56:31.714069 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/kserve-controller-manager-6986579898-z4zlk"] Apr 16 16:56:31.933778 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:56:31.933746 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="996ee467-7cdf-4397-bb8c-fb94c379d445" path="/var/lib/kubelet/pods/996ee467-7cdf-4397-bb8c-fb94c379d445/volumes" Apr 16 16:56:32.396330 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:56:32.396254 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-6986579898-v9pgs" event={"ID":"706af7ed-b512-4c1e-852d-9cd18155209d","Type":"ContainerStarted","Data":"ff6cfc68e63009cbb82c1a8f6495d19104b32770bc9261921aaf32623ebf6291"} Apr 16 16:56:32.396330 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:56:32.396288 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve/kserve-controller-manager-6986579898-v9pgs" event={"ID":"706af7ed-b512-4c1e-852d-9cd18155209d","Type":"ContainerStarted","Data":"bb7cb3eefe907dcb7d27328a7ac93a73d10b1c537f5780ff46f8852e6f3c037d"} Apr 16 16:56:32.396754 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:56:32.396401 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-6986579898-v9pgs" Apr 16 16:56:32.413792 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:56:32.413746 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-6986579898-v9pgs" podStartSLOduration=1.028035071 podStartE2EDuration="1.413730779s" podCreationTimestamp="2026-04-16 16:56:31 +0000 UTC" firstStartedPulling="2026-04-16 16:56:31.624727502 +0000 UTC m=+528.294904263" lastFinishedPulling="2026-04-16 16:56:32.01042321 +0000 UTC m=+528.680599971" observedRunningTime="2026-04-16 16:56:32.412039695 +0000 UTC m=+529.082216498" watchObservedRunningTime="2026-04-16 16:56:32.413730779 +0000 UTC m=+529.083907558" Apr 16 16:57:03.404621 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:57:03.404588 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-6986579898-v9pgs" Apr 16 16:57:19.859437 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:57:19.859400 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-init-kh7vv"] Apr 16 16:57:19.859814 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:57:19.859698 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="996ee467-7cdf-4397-bb8c-fb94c379d445" containerName="manager" Apr 16 16:57:19.859814 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:57:19.859712 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="996ee467-7cdf-4397-bb8c-fb94c379d445" containerName="manager" Apr 16 16:57:19.859814 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:57:19.859791 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="996ee467-7cdf-4397-bb8c-fb94c379d445" containerName="manager" Apr 16 16:57:19.862655 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:57:19.862638 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-kh7vv" Apr 16 16:57:19.865672 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:57:19.865647 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 16 16:57:19.865794 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:57:19.865672 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-tvxrm\"" Apr 16 16:57:19.866849 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:57:19.866828 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-kh7vv"] Apr 16 16:57:19.907625 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:57:19.907586 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzrxr\" (UniqueName: \"kubernetes.io/projected/08d72bfb-868c-436d-9d60-df4b01ca02da-kube-api-access-xzrxr\") pod \"s3-init-kh7vv\" (UID: \"08d72bfb-868c-436d-9d60-df4b01ca02da\") " pod="kserve/s3-init-kh7vv" Apr 16 16:57:20.008099 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:57:20.008060 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xzrxr\" (UniqueName: \"kubernetes.io/projected/08d72bfb-868c-436d-9d60-df4b01ca02da-kube-api-access-xzrxr\") pod \"s3-init-kh7vv\" (UID: \"08d72bfb-868c-436d-9d60-df4b01ca02da\") " pod="kserve/s3-init-kh7vv" Apr 16 16:57:20.016491 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:57:20.016466 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzrxr\" (UniqueName: \"kubernetes.io/projected/08d72bfb-868c-436d-9d60-df4b01ca02da-kube-api-access-xzrxr\") pod \"s3-init-kh7vv\" (UID: \"08d72bfb-868c-436d-9d60-df4b01ca02da\") " pod="kserve/s3-init-kh7vv" Apr 16 16:57:20.183470 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:57:20.183378 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-kh7vv" Apr 16 16:57:20.301608 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:57:20.301575 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-kh7vv"] Apr 16 16:57:20.304492 ip-10-0-131-63 kubenswrapper[2568]: W0416 16:57:20.304455 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod08d72bfb_868c_436d_9d60_df4b01ca02da.slice/crio-694a70d1d93483b403013eee8be935b86127f2dd3b7e15e3957ff1f3abd9e466 WatchSource:0}: Error finding container 694a70d1d93483b403013eee8be935b86127f2dd3b7e15e3957ff1f3abd9e466: Status 404 returned error can't find the container with id 694a70d1d93483b403013eee8be935b86127f2dd3b7e15e3957ff1f3abd9e466 Apr 16 16:57:20.527102 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:57:20.527064 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-kh7vv" event={"ID":"08d72bfb-868c-436d-9d60-df4b01ca02da","Type":"ContainerStarted","Data":"694a70d1d93483b403013eee8be935b86127f2dd3b7e15e3957ff1f3abd9e466"} Apr 16 16:57:25.544489 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:57:25.544447 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-kh7vv" event={"ID":"08d72bfb-868c-436d-9d60-df4b01ca02da","Type":"ContainerStarted","Data":"567701f521f9b5441746e459d54e92b145e046e2054f0bcdc2c2554c90814cc2"} Apr 16 16:57:25.560880 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:57:25.560831 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-init-kh7vv" podStartSLOduration=2.15689771 podStartE2EDuration="6.560816493s" podCreationTimestamp="2026-04-16 16:57:19 +0000 UTC" firstStartedPulling="2026-04-16 16:57:20.306381927 +0000 UTC m=+576.976558683" lastFinishedPulling="2026-04-16 16:57:24.71030071 +0000 UTC m=+581.380477466" observedRunningTime="2026-04-16 16:57:25.559154565 +0000 UTC m=+582.229331358" watchObservedRunningTime="2026-04-16 16:57:25.560816493 +0000 UTC m=+582.230993272" Apr 16 16:57:28.553991 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:57:28.553941 2568 generic.go:358] "Generic (PLEG): container finished" podID="08d72bfb-868c-436d-9d60-df4b01ca02da" containerID="567701f521f9b5441746e459d54e92b145e046e2054f0bcdc2c2554c90814cc2" exitCode=0 Apr 16 16:57:28.554352 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:57:28.554013 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-kh7vv" event={"ID":"08d72bfb-868c-436d-9d60-df4b01ca02da","Type":"ContainerDied","Data":"567701f521f9b5441746e459d54e92b145e046e2054f0bcdc2c2554c90814cc2"} Apr 16 16:57:29.680015 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:57:29.679995 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-kh7vv" Apr 16 16:57:29.799751 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:57:29.799704 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xzrxr\" (UniqueName: \"kubernetes.io/projected/08d72bfb-868c-436d-9d60-df4b01ca02da-kube-api-access-xzrxr\") pod \"08d72bfb-868c-436d-9d60-df4b01ca02da\" (UID: \"08d72bfb-868c-436d-9d60-df4b01ca02da\") " Apr 16 16:57:29.801813 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:57:29.801787 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08d72bfb-868c-436d-9d60-df4b01ca02da-kube-api-access-xzrxr" (OuterVolumeSpecName: "kube-api-access-xzrxr") pod "08d72bfb-868c-436d-9d60-df4b01ca02da" (UID: "08d72bfb-868c-436d-9d60-df4b01ca02da"). InnerVolumeSpecName "kube-api-access-xzrxr". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 16:57:29.901287 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:57:29.901202 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xzrxr\" (UniqueName: \"kubernetes.io/projected/08d72bfb-868c-436d-9d60-df4b01ca02da-kube-api-access-xzrxr\") on node \"ip-10-0-131-63.ec2.internal\" DevicePath \"\"" Apr 16 16:57:30.560860 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:57:30.560822 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-kh7vv" event={"ID":"08d72bfb-868c-436d-9d60-df4b01ca02da","Type":"ContainerDied","Data":"694a70d1d93483b403013eee8be935b86127f2dd3b7e15e3957ff1f3abd9e466"} Apr 16 16:57:30.560860 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:57:30.560845 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-kh7vv" Apr 16 16:57:30.560860 ip-10-0-131-63 kubenswrapper[2568]: I0416 16:57:30.560855 2568 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="694a70d1d93483b403013eee8be935b86127f2dd3b7e15e3957ff1f3abd9e466" Apr 16 17:02:30.304768 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:02:30.304733 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-dc743-predictor-565764948c-52wnv"] Apr 16 17:02:30.305348 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:02:30.305022 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="08d72bfb-868c-436d-9d60-df4b01ca02da" containerName="s3-init" Apr 16 17:02:30.305348 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:02:30.305034 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="08d72bfb-868c-436d-9d60-df4b01ca02da" containerName="s3-init" Apr 16 17:02:30.305348 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:02:30.305081 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="08d72bfb-868c-436d-9d60-df4b01ca02da" containerName="s3-init" Apr 16 17:02:30.308134 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:02:30.308114 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-raw-dc743-predictor-565764948c-52wnv"
Apr 16 17:02:30.310675 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:02:30.310651 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-bgqcs\""
Apr 16 17:02:30.316778 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:02:30.316754 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-dc743-predictor-565764948c-52wnv"]
Apr 16 17:02:30.344977 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:02:30.344570 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-raw-dc743-predictor-565764948c-52wnv"
Apr 16 17:02:30.470025 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:02:30.469996 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-dc743-predictor-565764948c-52wnv"]
Apr 16 17:02:30.472706 ip-10-0-131-63 kubenswrapper[2568]: W0416 17:02:30.472680 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod38ae0f9c_2d58_4bac_a4bc_9e987a6aa202.slice/crio-0a4a69973b9964d1dedd35a90a8d2bdefdc05c16068d7a52c19cc2d5eb1e0d2a WatchSource:0}: Error finding container 0a4a69973b9964d1dedd35a90a8d2bdefdc05c16068d7a52c19cc2d5eb1e0d2a: Status 404 returned error can't find the container with id 0a4a69973b9964d1dedd35a90a8d2bdefdc05c16068d7a52c19cc2d5eb1e0d2a
Apr 16 17:02:30.474930 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:02:30.474912 2568 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 17:02:31.390342 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:02:31.390307 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-raw-dc743-predictor-565764948c-52wnv" event={"ID":"38ae0f9c-2d58-4bac-a4bc-9e987a6aa202","Type":"ContainerStarted","Data":"0a4a69973b9964d1dedd35a90a8d2bdefdc05c16068d7a52c19cc2d5eb1e0d2a"}
Apr 16 17:02:32.395198 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:02:32.395163 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-raw-dc743-predictor-565764948c-52wnv" event={"ID":"38ae0f9c-2d58-4bac-a4bc-9e987a6aa202","Type":"ContainerStarted","Data":"6365b1deec424cbe4e20f79cff9cea35c60706c29bfd6ac206d013646d0701c1"}
Apr 16 17:02:32.395624 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:02:32.395474 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/message-dumper-raw-dc743-predictor-565764948c-52wnv"
Apr 16 17:02:32.397175 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:02:32.397153 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/message-dumper-raw-dc743-predictor-565764948c-52wnv"
Apr 16 17:02:32.411753 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:02:32.411705 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/message-dumper-raw-dc743-predictor-565764948c-52wnv" podStartSLOduration=1.402625368 podStartE2EDuration="2.411691261s" podCreationTimestamp="2026-04-16 17:02:30 +0000 UTC" firstStartedPulling="2026-04-16 17:02:30.475039054 +0000 UTC m=+887.145215815" lastFinishedPulling="2026-04-16 17:02:31.484104942 +0000 UTC m=+888.154281708" observedRunningTime="2026-04-16 17:02:32.409978697 +0000 UTC m=+889.080155478" watchObservedRunningTime="2026-04-16 17:02:32.411691261 +0000 UTC m=+889.081868040"
Apr 16 17:10:25.188480 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:10:25.188404 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-khblv/must-gather-fdhvb"]
Apr 16 17:10:25.191684 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:10:25.191654 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-khblv/must-gather-fdhvb"
Apr 16 17:10:25.194536 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:10:25.194498 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-khblv\"/\"default-dockercfg-rwhvd\""
Apr 16 17:10:25.194786 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:10:25.194769 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-khblv\"/\"openshift-service-ca.crt\""
Apr 16 17:10:25.195844 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:10:25.195829 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-khblv\"/\"kube-root-ca.crt\""
Apr 16 17:10:25.198990 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:10:25.198968 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-khblv/must-gather-fdhvb"]
Apr 16 17:10:25.334808 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:10:25.334747 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5v64\" (UniqueName: \"kubernetes.io/projected/c298f851-45dd-4da4-8f66-597d10c984ea-kube-api-access-r5v64\") pod \"must-gather-fdhvb\" (UID: \"c298f851-45dd-4da4-8f66-597d10c984ea\") " pod="openshift-must-gather-khblv/must-gather-fdhvb"
Apr 16 17:10:25.334984 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:10:25.334897 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c298f851-45dd-4da4-8f66-597d10c984ea-must-gather-output\") pod \"must-gather-fdhvb\" (UID: \"c298f851-45dd-4da4-8f66-597d10c984ea\") " pod="openshift-must-gather-khblv/must-gather-fdhvb"
Apr 16 17:10:25.435937 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:10:25.435900 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c298f851-45dd-4da4-8f66-597d10c984ea-must-gather-output\") pod \"must-gather-fdhvb\" (UID: \"c298f851-45dd-4da4-8f66-597d10c984ea\") " pod="openshift-must-gather-khblv/must-gather-fdhvb"
Apr 16 17:10:25.436110 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:10:25.435957 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r5v64\" (UniqueName: \"kubernetes.io/projected/c298f851-45dd-4da4-8f66-597d10c984ea-kube-api-access-r5v64\") pod \"must-gather-fdhvb\" (UID: \"c298f851-45dd-4da4-8f66-597d10c984ea\") " pod="openshift-must-gather-khblv/must-gather-fdhvb"
Apr 16 17:10:25.436208 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:10:25.436188 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c298f851-45dd-4da4-8f66-597d10c984ea-must-gather-output\") pod \"must-gather-fdhvb\" (UID: \"c298f851-45dd-4da4-8f66-597d10c984ea\") " pod="openshift-must-gather-khblv/must-gather-fdhvb"
Apr 16 17:10:25.444623 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:10:25.444557 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5v64\" (UniqueName: \"kubernetes.io/projected/c298f851-45dd-4da4-8f66-597d10c984ea-kube-api-access-r5v64\") pod \"must-gather-fdhvb\" (UID: \"c298f851-45dd-4da4-8f66-597d10c984ea\") " pod="openshift-must-gather-khblv/must-gather-fdhvb"
Apr 16 17:10:25.511667 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:10:25.511631 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-khblv/must-gather-fdhvb"
Apr 16 17:10:25.627169 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:10:25.627012 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-khblv/must-gather-fdhvb"]
Apr 16 17:10:25.630159 ip-10-0-131-63 kubenswrapper[2568]: W0416 17:10:25.630134 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc298f851_45dd_4da4_8f66_597d10c984ea.slice/crio-7a0503a0d8115b3ae540624d35eb114d9807d1064e66a57f011be3294ad79617 WatchSource:0}: Error finding container 7a0503a0d8115b3ae540624d35eb114d9807d1064e66a57f011be3294ad79617: Status 404 returned error can't find the container with id 7a0503a0d8115b3ae540624d35eb114d9807d1064e66a57f011be3294ad79617
Apr 16 17:10:25.632293 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:10:25.632273 2568 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 17:10:25.689323 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:10:25.689285 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-khblv/must-gather-fdhvb" event={"ID":"c298f851-45dd-4da4-8f66-597d10c984ea","Type":"ContainerStarted","Data":"7a0503a0d8115b3ae540624d35eb114d9807d1064e66a57f011be3294ad79617"}
Apr 16 17:10:30.708612 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:10:30.708571 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-khblv/must-gather-fdhvb" event={"ID":"c298f851-45dd-4da4-8f66-597d10c984ea","Type":"ContainerStarted","Data":"97afc34c61876bff031f4bbc53d0ccf33c9d922eaee2ae8a1441ba46bd68a236"}
Apr 16 17:10:30.708612 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:10:30.708616 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-khblv/must-gather-fdhvb" event={"ID":"c298f851-45dd-4da4-8f66-597d10c984ea","Type":"ContainerStarted","Data":"55d5ae7e0ac22a9d17fabd58d3feaa4daa03782c371eb0848f2f2610e9a3badc"}
Apr 16 17:10:30.724684 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:10:30.724632 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-khblv/must-gather-fdhvb" podStartSLOduration=1.210207804 podStartE2EDuration="5.724616544s" podCreationTimestamp="2026-04-16 17:10:25 +0000 UTC" firstStartedPulling="2026-04-16 17:10:25.632435245 +0000 UTC m=+1362.302612006" lastFinishedPulling="2026-04-16 17:10:30.14684399 +0000 UTC m=+1366.817020746" observedRunningTime="2026-04-16 17:10:30.723882357 +0000 UTC m=+1367.394059139" watchObservedRunningTime="2026-04-16 17:10:30.724616544 +0000 UTC m=+1367.394793327"
Apr 16 17:10:39.284614 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:10:39.284584 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_message-dumper-raw-dc743-predictor-565764948c-52wnv_38ae0f9c-2d58-4bac-a4bc-9e987a6aa202/kserve-container/0.log"
Apr 16 17:10:40.105230 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:10:40.105202 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_message-dumper-raw-dc743-predictor-565764948c-52wnv_38ae0f9c-2d58-4bac-a4bc-9e987a6aa202/kserve-container/0.log"
Apr 16 17:10:40.931721 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:10:40.931690 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_message-dumper-raw-dc743-predictor-565764948c-52wnv_38ae0f9c-2d58-4bac-a4bc-9e987a6aa202/kserve-container/0.log"
Apr 16 17:10:41.704586 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:10:41.704559 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_message-dumper-raw-dc743-predictor-565764948c-52wnv_38ae0f9c-2d58-4bac-a4bc-9e987a6aa202/kserve-container/0.log"
Apr 16 17:10:42.473066 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:10:42.473039 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_message-dumper-raw-dc743-predictor-565764948c-52wnv_38ae0f9c-2d58-4bac-a4bc-9e987a6aa202/kserve-container/0.log"
Apr 16 17:10:43.258185 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:10:43.258156 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_message-dumper-raw-dc743-predictor-565764948c-52wnv_38ae0f9c-2d58-4bac-a4bc-9e987a6aa202/kserve-container/0.log"
Apr 16 17:10:44.060997 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:10:44.060969 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_message-dumper-raw-dc743-predictor-565764948c-52wnv_38ae0f9c-2d58-4bac-a4bc-9e987a6aa202/kserve-container/0.log"
Apr 16 17:10:44.853363 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:10:44.853328 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_message-dumper-raw-dc743-predictor-565764948c-52wnv_38ae0f9c-2d58-4bac-a4bc-9e987a6aa202/kserve-container/0.log"
Apr 16 17:10:45.649856 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:10:45.649826 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_message-dumper-raw-dc743-predictor-565764948c-52wnv_38ae0f9c-2d58-4bac-a4bc-9e987a6aa202/kserve-container/0.log"
Apr 16 17:10:46.454066 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:10:46.454036 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_message-dumper-raw-dc743-predictor-565764948c-52wnv_38ae0f9c-2d58-4bac-a4bc-9e987a6aa202/kserve-container/0.log"
Apr 16 17:10:47.316242 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:10:47.316210 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_message-dumper-raw-dc743-predictor-565764948c-52wnv_38ae0f9c-2d58-4bac-a4bc-9e987a6aa202/kserve-container/0.log"
Apr 16 17:10:48.171644 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:10:48.171619 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_message-dumper-raw-dc743-predictor-565764948c-52wnv_38ae0f9c-2d58-4bac-a4bc-9e987a6aa202/kserve-container/0.log"
Apr 16 17:10:49.023912 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:10:49.023875 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_message-dumper-raw-dc743-predictor-565764948c-52wnv_38ae0f9c-2d58-4bac-a4bc-9e987a6aa202/kserve-container/0.log"
Apr 16 17:10:49.875347 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:10:49.875317 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_message-dumper-raw-dc743-predictor-565764948c-52wnv_38ae0f9c-2d58-4bac-a4bc-9e987a6aa202/kserve-container/0.log"
Apr 16 17:10:50.723384 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:10:50.723353 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_message-dumper-raw-dc743-predictor-565764948c-52wnv_38ae0f9c-2d58-4bac-a4bc-9e987a6aa202/kserve-container/0.log"
Apr 16 17:10:51.604362 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:10:51.604333 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_message-dumper-raw-dc743-predictor-565764948c-52wnv_38ae0f9c-2d58-4bac-a4bc-9e987a6aa202/kserve-container/0.log"
Apr 16 17:10:52.558078 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:10:52.558047 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_message-dumper-raw-dc743-predictor-565764948c-52wnv_38ae0f9c-2d58-4bac-a4bc-9e987a6aa202/kserve-container/0.log"
Apr 16 17:10:53.547762 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:10:53.547723 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_message-dumper-raw-dc743-predictor-565764948c-52wnv_38ae0f9c-2d58-4bac-a4bc-9e987a6aa202/kserve-container/0.log"
Apr 16 17:10:54.779279 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:10:54.779246 2568 generic.go:358] "Generic (PLEG): container finished" podID="c298f851-45dd-4da4-8f66-597d10c984ea" containerID="55d5ae7e0ac22a9d17fabd58d3feaa4daa03782c371eb0848f2f2610e9a3badc" exitCode=0
Apr 16 17:10:54.779890 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:10:54.779299 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-khblv/must-gather-fdhvb" event={"ID":"c298f851-45dd-4da4-8f66-597d10c984ea","Type":"ContainerDied","Data":"55d5ae7e0ac22a9d17fabd58d3feaa4daa03782c371eb0848f2f2610e9a3badc"}
Apr 16 17:10:54.779890 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:10:54.779666 2568 scope.go:117] "RemoveContainer" containerID="55d5ae7e0ac22a9d17fabd58d3feaa4daa03782c371eb0848f2f2610e9a3badc"
Apr 16 17:10:55.209888 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:10:55.209815 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-khblv_must-gather-fdhvb_c298f851-45dd-4da4-8f66-597d10c984ea/gather/0.log"
Apr 16 17:10:55.843332 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:10:55.843297 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-2fspv/must-gather-ktrvh"]
Apr 16 17:10:55.846610 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:10:55.846586 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2fspv/must-gather-ktrvh"
Apr 16 17:10:55.850789 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:10:55.850769 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-2fspv\"/\"kube-root-ca.crt\""
Apr 16 17:10:55.850894 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:10:55.850769 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-2fspv\"/\"default-dockercfg-f9wr2\""
Apr 16 17:10:55.850894 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:10:55.850861 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-2fspv\"/\"openshift-service-ca.crt\""
Apr 16 17:10:55.859290 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:10:55.859258 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-2fspv/must-gather-ktrvh"]
Apr 16 17:10:55.895420 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:10:55.895381 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/40904e38-7fb8-46a6-9360-799fff6a964f-must-gather-output\") pod \"must-gather-ktrvh\" (UID: \"40904e38-7fb8-46a6-9360-799fff6a964f\") " pod="openshift-must-gather-2fspv/must-gather-ktrvh"
Apr 16 17:10:55.895420 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:10:55.895422 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lnmgq\" (UniqueName: \"kubernetes.io/projected/40904e38-7fb8-46a6-9360-799fff6a964f-kube-api-access-lnmgq\") pod \"must-gather-ktrvh\" (UID: \"40904e38-7fb8-46a6-9360-799fff6a964f\") " pod="openshift-must-gather-2fspv/must-gather-ktrvh"
Apr 16 17:10:55.996758 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:10:55.996705 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/40904e38-7fb8-46a6-9360-799fff6a964f-must-gather-output\") pod \"must-gather-ktrvh\" (UID: \"40904e38-7fb8-46a6-9360-799fff6a964f\") " pod="openshift-must-gather-2fspv/must-gather-ktrvh"
Apr 16 17:10:55.996758 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:10:55.996765 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lnmgq\" (UniqueName: \"kubernetes.io/projected/40904e38-7fb8-46a6-9360-799fff6a964f-kube-api-access-lnmgq\") pod \"must-gather-ktrvh\" (UID: \"40904e38-7fb8-46a6-9360-799fff6a964f\") " pod="openshift-must-gather-2fspv/must-gather-ktrvh"
Apr 16 17:10:55.997045 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:10:55.997027 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/40904e38-7fb8-46a6-9360-799fff6a964f-must-gather-output\") pod \"must-gather-ktrvh\" (UID: \"40904e38-7fb8-46a6-9360-799fff6a964f\") " pod="openshift-must-gather-2fspv/must-gather-ktrvh"
Apr 16 17:10:56.005937 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:10:56.005905 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lnmgq\" (UniqueName: \"kubernetes.io/projected/40904e38-7fb8-46a6-9360-799fff6a964f-kube-api-access-lnmgq\") pod \"must-gather-ktrvh\" (UID: \"40904e38-7fb8-46a6-9360-799fff6a964f\") " pod="openshift-must-gather-2fspv/must-gather-ktrvh"
Apr 16 17:10:56.155659 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:10:56.155574 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2fspv/must-gather-ktrvh"
Apr 16 17:10:56.272479 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:10:56.272456 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-2fspv/must-gather-ktrvh"]
Apr 16 17:10:56.274134 ip-10-0-131-63 kubenswrapper[2568]: W0416 17:10:56.274102 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod40904e38_7fb8_46a6_9360_799fff6a964f.slice/crio-05ba48a4f6b580f40521aa8073b67c76402b9eeb32d90ec77a1d8502dc1896c2 WatchSource:0}: Error finding container 05ba48a4f6b580f40521aa8073b67c76402b9eeb32d90ec77a1d8502dc1896c2: Status 404 returned error can't find the container with id 05ba48a4f6b580f40521aa8073b67c76402b9eeb32d90ec77a1d8502dc1896c2
Apr 16 17:10:56.786535 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:10:56.786480 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2fspv/must-gather-ktrvh" event={"ID":"40904e38-7fb8-46a6-9360-799fff6a964f","Type":"ContainerStarted","Data":"05ba48a4f6b580f40521aa8073b67c76402b9eeb32d90ec77a1d8502dc1896c2"}
Apr 16 17:10:57.794545 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:10:57.794496 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2fspv/must-gather-ktrvh" event={"ID":"40904e38-7fb8-46a6-9360-799fff6a964f","Type":"ContainerStarted","Data":"d91b0dcfcee70faaf9a5af084ad6e244d7064947ef79921f3ce71604a7a1f3b9"}
Apr 16 17:10:57.794545 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:10:57.794550 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2fspv/must-gather-ktrvh" event={"ID":"40904e38-7fb8-46a6-9360-799fff6a964f","Type":"ContainerStarted","Data":"0a85a808a72106037d706a03690a5a593342b5378d018690c3a9f37e9e736718"}
Apr 16 17:10:57.810167 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:10:57.810111 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-2fspv/must-gather-ktrvh" podStartSLOduration=1.89669608 podStartE2EDuration="2.810095505s" podCreationTimestamp="2026-04-16 17:10:55 +0000 UTC" firstStartedPulling="2026-04-16 17:10:56.275900255 +0000 UTC m=+1392.946077016" lastFinishedPulling="2026-04-16 17:10:57.189299676 +0000 UTC m=+1393.859476441" observedRunningTime="2026-04-16 17:10:57.80937355 +0000 UTC m=+1394.479550355" watchObservedRunningTime="2026-04-16 17:10:57.810095505 +0000 UTC m=+1394.480272325"
Apr 16 17:10:58.558081 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:10:58.558048 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-4bqrw_5ae3c2f7-2de4-4c3c-9cf8-bc6213e92f77/global-pull-secret-syncer/0.log"
Apr 16 17:10:58.673636 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:10:58.673598 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-d4xtv_49be5da9-85b8-4a24-8fec-db2c506efbbc/konnectivity-agent/0.log"
Apr 16 17:10:58.790500 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:10:58.790475 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-131-63.ec2.internal_efe8b482663d36b8cd0f3c122fed91e1/haproxy/0.log"
Apr 16 17:11:00.646664 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:11:00.645841 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-khblv/must-gather-fdhvb"]
Apr 16 17:11:00.646664 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:11:00.646130 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-khblv/must-gather-fdhvb" podUID="c298f851-45dd-4da4-8f66-597d10c984ea" containerName="copy" containerID="cri-o://97afc34c61876bff031f4bbc53d0ccf33c9d922eaee2ae8a1441ba46bd68a236" gracePeriod=2
Apr 16 17:11:00.652199 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:11:00.652172 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-khblv/must-gather-fdhvb"]
Apr 16 17:11:00.811172 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:11:00.811095 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-khblv_must-gather-fdhvb_c298f851-45dd-4da4-8f66-597d10c984ea/copy/0.log"
Apr 16 17:11:00.811758 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:11:00.811639 2568 generic.go:358] "Generic (PLEG): container finished" podID="c298f851-45dd-4da4-8f66-597d10c984ea" containerID="97afc34c61876bff031f4bbc53d0ccf33c9d922eaee2ae8a1441ba46bd68a236" exitCode=143
Apr 16 17:11:01.059366 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:11:01.059299 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-khblv_must-gather-fdhvb_c298f851-45dd-4da4-8f66-597d10c984ea/copy/0.log"
Apr 16 17:11:01.063379 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:11:01.063105 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-khblv/must-gather-fdhvb"
Apr 16 17:11:01.066017 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:11:01.065963 2568 status_manager.go:895] "Failed to get status for pod" podUID="c298f851-45dd-4da4-8f66-597d10c984ea" pod="openshift-must-gather-khblv/must-gather-fdhvb" err="pods \"must-gather-fdhvb\" is forbidden: User \"system:node:ip-10-0-131-63.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-khblv\": no relationship found between node 'ip-10-0-131-63.ec2.internal' and this object"
Apr 16 17:11:01.153551 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:11:01.150137 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r5v64\" (UniqueName: \"kubernetes.io/projected/c298f851-45dd-4da4-8f66-597d10c984ea-kube-api-access-r5v64\") pod \"c298f851-45dd-4da4-8f66-597d10c984ea\" (UID: \"c298f851-45dd-4da4-8f66-597d10c984ea\") "
Apr 16 17:11:01.153551 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:11:01.150230 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c298f851-45dd-4da4-8f66-597d10c984ea-must-gather-output\") pod \"c298f851-45dd-4da4-8f66-597d10c984ea\" (UID: \"c298f851-45dd-4da4-8f66-597d10c984ea\") "
Apr 16 17:11:01.153551 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:11:01.153276 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c298f851-45dd-4da4-8f66-597d10c984ea-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "c298f851-45dd-4da4-8f66-597d10c984ea" (UID: "c298f851-45dd-4da4-8f66-597d10c984ea"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 17:11:01.158718 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:11:01.158678 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c298f851-45dd-4da4-8f66-597d10c984ea-kube-api-access-r5v64" (OuterVolumeSpecName: "kube-api-access-r5v64") pod "c298f851-45dd-4da4-8f66-597d10c984ea" (UID: "c298f851-45dd-4da4-8f66-597d10c984ea"). InnerVolumeSpecName "kube-api-access-r5v64". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 17:11:01.251873 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:11:01.251840 2568 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c298f851-45dd-4da4-8f66-597d10c984ea-must-gather-output\") on node \"ip-10-0-131-63.ec2.internal\" DevicePath \"\""
Apr 16 17:11:01.252224 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:11:01.252203 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-r5v64\" (UniqueName: \"kubernetes.io/projected/c298f851-45dd-4da4-8f66-597d10c984ea-kube-api-access-r5v64\") on node \"ip-10-0-131-63.ec2.internal\" DevicePath \"\""
Apr 16 17:11:01.816418 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:11:01.816383 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-khblv_must-gather-fdhvb_c298f851-45dd-4da4-8f66-597d10c984ea/copy/0.log"
Apr 16 17:11:01.816943 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:11:01.816821 2568 scope.go:117] "RemoveContainer" containerID="97afc34c61876bff031f4bbc53d0ccf33c9d922eaee2ae8a1441ba46bd68a236"
Apr 16 17:11:01.816995 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:11:01.816961 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-khblv/must-gather-fdhvb"
Apr 16 17:11:01.825361 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:11:01.824870 2568 status_manager.go:895] "Failed to get status for pod" podUID="c298f851-45dd-4da4-8f66-597d10c984ea" pod="openshift-must-gather-khblv/must-gather-fdhvb" err="pods \"must-gather-fdhvb\" is forbidden: User \"system:node:ip-10-0-131-63.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-khblv\": no relationship found between node 'ip-10-0-131-63.ec2.internal' and this object"
Apr 16 17:11:01.838440 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:11:01.836639 2568 scope.go:117] "RemoveContainer" containerID="55d5ae7e0ac22a9d17fabd58d3feaa4daa03782c371eb0848f2f2610e9a3badc"
Apr 16 17:11:01.838440 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:11:01.836859 2568 status_manager.go:895] "Failed to get status for pod" podUID="c298f851-45dd-4da4-8f66-597d10c984ea" pod="openshift-must-gather-khblv/must-gather-fdhvb" err="pods \"must-gather-fdhvb\" is forbidden: User \"system:node:ip-10-0-131-63.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-khblv\": no relationship found between node 'ip-10-0-131-63.ec2.internal' and this object"
Apr 16 17:11:01.936075 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:11:01.935489 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c298f851-45dd-4da4-8f66-597d10c984ea" path="/var/lib/kubelet/pods/c298f851-45dd-4da4-8f66-597d10c984ea/volumes"
Apr 16 17:11:02.312025 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:11:02.311957 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-6667474d89-xwh89_073fb8a8-affb-434d-9a78-e8ef0444fc11/cluster-monitoring-operator/0.log"
Apr 16 17:11:02.446912 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:11:02.446882 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-5876b4bbc7-nxqlt_a8fa4d4b-1853-466d-a89a-f197d586e400/monitoring-plugin/0.log"
Apr 16 17:11:02.580349 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:11:02.580249 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-9g78g_4e1fd5dc-ed1e-40be-a46a-6c2e9a2ef55e/node-exporter/0.log"
Apr 16 17:11:02.609472 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:11:02.609441 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-9g78g_4e1fd5dc-ed1e-40be-a46a-6c2e9a2ef55e/kube-rbac-proxy/0.log"
Apr 16 17:11:02.635354 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:11:02.635323 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-9g78g_4e1fd5dc-ed1e-40be-a46a-6c2e9a2ef55e/init-textfile/0.log"
Apr 16 17:11:02.836116 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:11:02.835873 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_c30842ca-2edc-4775-afe7-8ca58bd3f029/prometheus/0.log"
Apr 16 17:11:02.857443 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:11:02.857408 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_c30842ca-2edc-4775-afe7-8ca58bd3f029/config-reloader/0.log"
Apr 16 17:11:02.880507 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:11:02.880479 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_c30842ca-2edc-4775-afe7-8ca58bd3f029/thanos-sidecar/0.log"
Apr 16 17:11:02.902591 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:11:02.902560 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_c30842ca-2edc-4775-afe7-8ca58bd3f029/kube-rbac-proxy-web/0.log"
Apr 16 17:11:02.928039 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:11:02.928012 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_c30842ca-2edc-4775-afe7-8ca58bd3f029/kube-rbac-proxy/0.log"
Apr 16 17:11:02.953780 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:11:02.953747 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_c30842ca-2edc-4775-afe7-8ca58bd3f029/kube-rbac-proxy-thanos/0.log"
Apr 16 17:11:02.981990 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:11:02.981964 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_c30842ca-2edc-4775-afe7-8ca58bd3f029/init-config-reloader/0.log"
Apr 16 17:11:06.313055 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:11:06.313027 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-ndbsz_e9c730fc-1a8b-4c50-92f3-2ebfd693c270/dns/0.log"
Apr 16 17:11:06.333486 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:11:06.333462 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-ndbsz_e9c730fc-1a8b-4c50-92f3-2ebfd693c270/kube-rbac-proxy/0.log"
Apr 16 17:11:06.422672 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:11:06.422645 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-ptmqw_0c447c29-c470-4314-becc-ad24580321c8/dns-node-resolver/0.log"
Apr 16 17:11:06.661976 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:11:06.661897 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-2fspv/perf-node-gather-daemonset-rr4pv"]
Apr 16 17:11:06.662355 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:11:06.662328 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c298f851-45dd-4da4-8f66-597d10c984ea" containerName="copy"
Apr 16 17:11:06.662467 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:11:06.662358 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="c298f851-45dd-4da4-8f66-597d10c984ea" containerName="copy"
Apr 16 17:11:06.662467 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:11:06.662385 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c298f851-45dd-4da4-8f66-597d10c984ea" containerName="gather"
Apr 16 17:11:06.662467 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:11:06.662393 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="c298f851-45dd-4da4-8f66-597d10c984ea" containerName="gather"
Apr 16 17:11:06.662636 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:11:06.662476 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="c298f851-45dd-4da4-8f66-597d10c984ea" containerName="gather"
Apr 16 17:11:06.662636 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:11:06.662489 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="c298f851-45dd-4da4-8f66-597d10c984ea" containerName="copy"
Apr 16 17:11:06.667453 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:11:06.667430 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2fspv/perf-node-gather-daemonset-rr4pv"
Apr 16 17:11:06.673570 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:11:06.673547 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-2fspv/perf-node-gather-daemonset-rr4pv"]
Apr 16 17:11:06.799001 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:11:06.798965 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4k8k8\" (UniqueName: \"kubernetes.io/projected/036e342e-b1e1-480d-8a57-ab9b029c94e8-kube-api-access-4k8k8\") pod \"perf-node-gather-daemonset-rr4pv\" (UID: \"036e342e-b1e1-480d-8a57-ab9b029c94e8\") " pod="openshift-must-gather-2fspv/perf-node-gather-daemonset-rr4pv"
Apr 16 17:11:06.799151 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:11:06.799018 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/036e342e-b1e1-480d-8a57-ab9b029c94e8-lib-modules\") pod \"perf-node-gather-daemonset-rr4pv\" (UID: \"036e342e-b1e1-480d-8a57-ab9b029c94e8\") " pod="openshift-must-gather-2fspv/perf-node-gather-daemonset-rr4pv"
Apr 16 17:11:06.799151 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:11:06.799037 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/036e342e-b1e1-480d-8a57-ab9b029c94e8-podres\") pod \"perf-node-gather-daemonset-rr4pv\" (UID: \"036e342e-b1e1-480d-8a57-ab9b029c94e8\") " pod="openshift-must-gather-2fspv/perf-node-gather-daemonset-rr4pv"
Apr 16 17:11:06.799151 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:11:06.799080 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/036e342e-b1e1-480d-8a57-ab9b029c94e8-proc\") pod \"perf-node-gather-daemonset-rr4pv\" (UID: \"036e342e-b1e1-480d-8a57-ab9b029c94e8\") " pod="openshift-must-gather-2fspv/perf-node-gather-daemonset-rr4pv"
Apr 16 17:11:06.799151 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:11:06.799104 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/036e342e-b1e1-480d-8a57-ab9b029c94e8-sys\") pod \"perf-node-gather-daemonset-rr4pv\" (UID: \"036e342e-b1e1-480d-8a57-ab9b029c94e8\") " pod="openshift-must-gather-2fspv/perf-node-gather-daemonset-rr4pv"
Apr 16 17:11:06.873139 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:11:06.873110 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-7655577c6d-7np48_9d1ef785-865a-4cf2-b57e-d3b55eea5a1c/registry/0.log"
Apr 16 17:11:06.900581 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:11:06.900500 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/036e342e-b1e1-480d-8a57-ab9b029c94e8-lib-modules\") pod \"perf-node-gather-daemonset-rr4pv\" (UID: \"036e342e-b1e1-480d-8a57-ab9b029c94e8\") " pod="openshift-must-gather-2fspv/perf-node-gather-daemonset-rr4pv"
Apr 16 17:11:06.900581 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:11:06.900583 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/036e342e-b1e1-480d-8a57-ab9b029c94e8-podres\") pod \"perf-node-gather-daemonset-rr4pv\" (UID: \"036e342e-b1e1-480d-8a57-ab9b029c94e8\") " pod="openshift-must-gather-2fspv/perf-node-gather-daemonset-rr4pv"
Apr 16 17:11:06.900899 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:11:06.900635 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/036e342e-b1e1-480d-8a57-ab9b029c94e8-proc\") pod \"perf-node-gather-daemonset-rr4pv\" (UID: \"036e342e-b1e1-480d-8a57-ab9b029c94e8\") " pod="openshift-must-gather-2fspv/perf-node-gather-daemonset-rr4pv"
Apr 16 17:11:06.900899 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:11:06.900672 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/036e342e-b1e1-480d-8a57-ab9b029c94e8-sys\") pod \"perf-node-gather-daemonset-rr4pv\" (UID: \"036e342e-b1e1-480d-8a57-ab9b029c94e8\") " pod="openshift-must-gather-2fspv/perf-node-gather-daemonset-rr4pv"
Apr 16 17:11:06.900899 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:11:06.900720 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/036e342e-b1e1-480d-8a57-ab9b029c94e8-lib-modules\") pod \"perf-node-gather-daemonset-rr4pv\" (UID: \"036e342e-b1e1-480d-8a57-ab9b029c94e8\") " pod="openshift-must-gather-2fspv/perf-node-gather-daemonset-rr4pv"
Apr 16 17:11:06.900899 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:11:06.900741 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/036e342e-b1e1-480d-8a57-ab9b029c94e8-podres\") pod \"perf-node-gather-daemonset-rr4pv\" (UID: \"036e342e-b1e1-480d-8a57-ab9b029c94e8\") " pod="openshift-must-gather-2fspv/perf-node-gather-daemonset-rr4pv"
Apr 16 17:11:06.900899 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:11:06.900754 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4k8k8\" (UniqueName: \"kubernetes.io/projected/036e342e-b1e1-480d-8a57-ab9b029c94e8-kube-api-access-4k8k8\") pod \"perf-node-gather-daemonset-rr4pv\" (UID: \"036e342e-b1e1-480d-8a57-ab9b029c94e8\") " pod="openshift-must-gather-2fspv/perf-node-gather-daemonset-rr4pv"
Apr 16 17:11:06.900899 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:11:06.900778 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/036e342e-b1e1-480d-8a57-ab9b029c94e8-proc\") pod \"perf-node-gather-daemonset-rr4pv\" (UID: \"036e342e-b1e1-480d-8a57-ab9b029c94e8\") " pod="openshift-must-gather-2fspv/perf-node-gather-daemonset-rr4pv"
Apr 16 17:11:06.900899 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:11:06.900792 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/036e342e-b1e1-480d-8a57-ab9b029c94e8-sys\") pod \"perf-node-gather-daemonset-rr4pv\" (UID: \"036e342e-b1e1-480d-8a57-ab9b029c94e8\") " pod="openshift-must-gather-2fspv/perf-node-gather-daemonset-rr4pv"
Apr 16 17:11:06.908505 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:11:06.908471 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4k8k8\" (UniqueName: \"kubernetes.io/projected/036e342e-b1e1-480d-8a57-ab9b029c94e8-kube-api-access-4k8k8\") pod \"perf-node-gather-daemonset-rr4pv\" (UID: \"036e342e-b1e1-480d-8a57-ab9b029c94e8\") " pod="openshift-must-gather-2fspv/perf-node-gather-daemonset-rr4pv"
Apr 16 17:11:06.913926 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:11:06.913861 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-j8nrt_c39d9cc0-8cac-46bd-968d-baba878cd954/node-ca/0.log"
Apr 16 17:11:06.978878 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:11:06.978840 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2fspv/perf-node-gather-daemonset-rr4pv"
Apr 16 17:11:07.110175 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:11:07.110142 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-2fspv/perf-node-gather-daemonset-rr4pv"]
Apr 16 17:11:07.113699 ip-10-0-131-63 kubenswrapper[2568]: W0416 17:11:07.113653 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod036e342e_b1e1_480d_8a57_ab9b029c94e8.slice/crio-feac784817d0418335400383f3b50f4c1056c44fbfe8435bbcb8b6b987b92427 WatchSource:0}: Error finding container feac784817d0418335400383f3b50f4c1056c44fbfe8435bbcb8b6b987b92427: Status 404 returned error can't find the container with id feac784817d0418335400383f3b50f4c1056c44fbfe8435bbcb8b6b987b92427
Apr 16 17:11:07.847897 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:11:07.847008 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2fspv/perf-node-gather-daemonset-rr4pv" event={"ID":"036e342e-b1e1-480d-8a57-ab9b029c94e8","Type":"ContainerStarted","Data":"5a0a389a537d28b7b51413cfd7b39e13d485e04353ef74e941901789f09d97ea"}
Apr 16 17:11:07.847897 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:11:07.847062 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2fspv/perf-node-gather-daemonset-rr4pv" event={"ID":"036e342e-b1e1-480d-8a57-ab9b029c94e8","Type":"ContainerStarted","Data":"feac784817d0418335400383f3b50f4c1056c44fbfe8435bbcb8b6b987b92427"}
Apr 16 17:11:07.847897 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:11:07.847881 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-2fspv/perf-node-gather-daemonset-rr4pv"
Apr 16 17:11:07.862159 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:11:07.862113 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-2fspv/perf-node-gather-daemonset-rr4pv" podStartSLOduration=1.86209947 podStartE2EDuration="1.86209947s" podCreationTimestamp="2026-04-16 17:11:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 17:11:07.860749911 +0000 UTC m=+1404.530926716" watchObservedRunningTime="2026-04-16 17:11:07.86209947 +0000 UTC m=+1404.532276296"
Apr 16 17:11:07.945339 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:11:07.945312 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-4nrrn_af4910d1-f39b-44e2-805e-bcc17c0e30d0/serve-healthcheck-canary/0.log"
Apr 16 17:11:08.338535 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:11:08.338489 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-ctpwk_8123167b-2df8-460e-ab9c-5bdd6b4a099f/kube-rbac-proxy/0.log"
Apr 16 17:11:08.358536 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:11:08.358493 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-ctpwk_8123167b-2df8-460e-ab9c-5bdd6b4a099f/exporter/0.log"
Apr 16 17:11:08.379198 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:11:08.379176 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-ctpwk_8123167b-2df8-460e-ab9c-5bdd6b4a099f/extractor/0.log"
Apr 16 17:11:10.381628 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:11:10.381596 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_kserve-controller-manager-6986579898-v9pgs_706af7ed-b512-4c1e-852d-9cd18155209d/manager/0.log"
Apr 16 17:11:10.529035 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:11:10.529009 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-init-kh7vv_08d72bfb-868c-436d-9d60-df4b01ca02da/s3-init/0.log"
Apr 16 17:11:14.863399 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:11:14.863368 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-2fspv/perf-node-gather-daemonset-rr4pv"
Apr 16 17:11:15.245340 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:11:15.245267 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-brchx_482c17e3-998c-48aa-b158-037aa6ebf920/kube-multus-additional-cni-plugins/0.log"
Apr 16 17:11:15.266661 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:11:15.266639 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-brchx_482c17e3-998c-48aa-b158-037aa6ebf920/egress-router-binary-copy/0.log"
Apr 16 17:11:15.286665 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:11:15.286641 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-brchx_482c17e3-998c-48aa-b158-037aa6ebf920/cni-plugins/0.log"
Apr 16 17:11:15.309359 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:11:15.309329 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-brchx_482c17e3-998c-48aa-b158-037aa6ebf920/bond-cni-plugin/0.log"
Apr 16 17:11:15.329650 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:11:15.329625 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-brchx_482c17e3-998c-48aa-b158-037aa6ebf920/routeoverride-cni/0.log"
Apr 16 17:11:15.349795 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:11:15.349770 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-brchx_482c17e3-998c-48aa-b158-037aa6ebf920/whereabouts-cni-bincopy/0.log"
Apr 16 17:11:15.371205 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:11:15.371180 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-brchx_482c17e3-998c-48aa-b158-037aa6ebf920/whereabouts-cni/0.log"
Apr 16 17:11:15.576350 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:11:15.576317 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-nwmhb_8d03bf36-9222-4d14-b16d-5f31b197f11a/kube-multus/0.log"
Apr 16 17:11:15.620260 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:11:15.620236 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-5rwjz_65f280f9-caf6-429e-ac03-31bd647a05b6/network-metrics-daemon/0.log"
Apr 16 17:11:15.641075 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:11:15.641050 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-5rwjz_65f280f9-caf6-429e-ac03-31bd647a05b6/kube-rbac-proxy/0.log"
Apr 16 17:11:17.075713 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:11:17.075681 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-s2x9b_40202585-b938-4a5c-bde8-ac1c5ea40044/ovn-controller/0.log"
Apr 16 17:11:17.105682 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:11:17.105605 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-s2x9b_40202585-b938-4a5c-bde8-ac1c5ea40044/ovn-acl-logging/0.log"
Apr 16 17:11:17.127727 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:11:17.127689 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-s2x9b_40202585-b938-4a5c-bde8-ac1c5ea40044/kube-rbac-proxy-node/0.log"
Apr 16 17:11:17.153384 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:11:17.153355 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-s2x9b_40202585-b938-4a5c-bde8-ac1c5ea40044/kube-rbac-proxy-ovn-metrics/0.log"
Apr 16 17:11:17.176159 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:11:17.176128 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-s2x9b_40202585-b938-4a5c-bde8-ac1c5ea40044/northd/0.log"
Apr 16 17:11:17.203725 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:11:17.203697 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-s2x9b_40202585-b938-4a5c-bde8-ac1c5ea40044/nbdb/0.log"
Apr 16 17:11:17.227024 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:11:17.227003 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-s2x9b_40202585-b938-4a5c-bde8-ac1c5ea40044/sbdb/0.log"
Apr 16 17:11:17.363915 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:11:17.363867 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-s2x9b_40202585-b938-4a5c-bde8-ac1c5ea40044/ovnkube-controller/0.log"
Apr 16 17:11:18.419427 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:11:18.419399 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-ts7hl_018fd32e-3479-4227-9d81-8a232b27fc2b/network-check-target-container/0.log"
Apr 16 17:11:19.422377 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:11:19.422348 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-j6gfs_1f2cee13-cf01-40a3-993d-f9a41ddeae81/iptables-alerter/0.log"
Apr 16 17:11:19.974416 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:11:19.974382 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-7p8g2_9227133b-0000-4b75-818e-9c2ed2ffe214/tuned/0.log"
Apr 16 17:11:21.753042 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:11:21.752958 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-667775844f-zn9sv_5d7eab9c-0611-4981-95ed-6811b8ca42d6/cluster-samples-operator/0.log"
Apr 16 17:11:21.768215 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:11:21.768190 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-667775844f-zn9sv_5d7eab9c-0611-4981-95ed-6811b8ca42d6/cluster-samples-operator-watch/0.log"
Apr 16 17:11:22.643088 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:11:22.643056 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-service-ca-operator_service-ca-operator-69965bb79d-p8zc8_a7ed2b8a-4851-4792-868d-23a18751df58/service-ca-operator/1.log"
Apr 16 17:11:22.644091 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:11:22.644070 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-service-ca-operator_service-ca-operator-69965bb79d-p8zc8_a7ed2b8a-4851-4792-868d-23a18751df58/service-ca-operator/0.log"
Apr 16 17:11:22.943176 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:11:22.943106 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-service-ca_service-ca-bfc587fb7-stqj4_e23d01eb-8ff6-40f3-a05d-347fdc6d12b3/service-ca-controller/0.log"
Apr 16 17:11:23.448381 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:11:23.448353 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-xzx59_6ee9ccfb-aa99-4334-93d4-21dad3568cda/csi-driver/0.log"
Apr 16 17:11:23.468385 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:11:23.468357 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-xzx59_6ee9ccfb-aa99-4334-93d4-21dad3568cda/csi-node-driver-registrar/0.log"
Apr 16 17:11:23.490363 ip-10-0-131-63 kubenswrapper[2568]: I0416 17:11:23.490340 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-xzx59_6ee9ccfb-aa99-4334-93d4-21dad3568cda/csi-liveness-probe/0.log"