Apr 16 08:33:06.460161 ip-10-0-130-41 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 08:33:06.961056 ip-10-0-130-41 kubenswrapper[2579]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 08:33:06.961056 ip-10-0-130-41 kubenswrapper[2579]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 08:33:06.961056 ip-10-0-130-41 kubenswrapper[2579]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 08:33:06.961056 ip-10-0-130-41 kubenswrapper[2579]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 08:33:06.961056 ip-10-0-130-41 kubenswrapper[2579]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 08:33:06.965509 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.965421 2579 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 08:33:06.970664 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.970648 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 08:33:06.970664 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.970665 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 08:33:06.970731 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.970669 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 08:33:06.970731 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.970672 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 08:33:06.970731 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.970675 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 08:33:06.970731 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.970678 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 08:33:06.970731 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.970680 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 08:33:06.970731 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.970683 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 08:33:06.970731 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.970686 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 08:33:06.970731 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.970688 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 08:33:06.970731 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.970691 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 08:33:06.970731 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.970694 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 08:33:06.970731 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.970696 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 08:33:06.970731 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.970699 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 08:33:06.970731 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.970701 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 08:33:06.970731 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.970703 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 08:33:06.970731 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.970706 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 08:33:06.970731 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.970711 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 08:33:06.970731 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.970713 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 08:33:06.970731 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.970716 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 08:33:06.970731 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.970718 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 08:33:06.970731 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.970721 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 08:33:06.971228 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.970724 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 08:33:06.971228 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.970726 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 08:33:06.971228 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.970728 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 08:33:06.971228 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.970732 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 08:33:06.971228 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.970735 2579 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 08:33:06.971228 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.970737 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 08:33:06.971228 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.970740 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 08:33:06.971228 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.970743 2579 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 08:33:06.971228 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.970746 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 08:33:06.971228 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.970749 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 08:33:06.971228 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.970751 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 08:33:06.971228 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.970753 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 08:33:06.971228 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.970756 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 08:33:06.971228 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.970759 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 08:33:06.971228 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.970762 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 08:33:06.971228 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.970764 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 08:33:06.971228 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.970767 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 08:33:06.971228 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.970769 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 08:33:06.971228 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.970771 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 08:33:06.971228 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.970774 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 08:33:06.971808 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.970776 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 08:33:06.971808 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.970779 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 08:33:06.971808 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.970782 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 08:33:06.971808 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.970784 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 08:33:06.971808 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.970788 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 08:33:06.971808 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.970790 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 08:33:06.971808 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.970793 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 08:33:06.971808 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.970795 2579 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 08:33:06.971808 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.970798 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 08:33:06.971808 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.970800 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 08:33:06.971808 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.970803 2579 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 08:33:06.971808 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.970805 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 08:33:06.971808 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.970808 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 08:33:06.971808 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.970811 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 08:33:06.971808 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.970813 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 08:33:06.971808 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.970816 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 08:33:06.971808 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.970819 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 08:33:06.971808 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.970821 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 08:33:06.971808 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.970824 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 08:33:06.971808 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.970826 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 08:33:06.972309 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.970829 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 08:33:06.972309 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.970831 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 08:33:06.972309 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.970833 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 08:33:06.972309 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.970836 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 08:33:06.972309 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.970838 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 08:33:06.972309 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.970841 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 08:33:06.972309 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.970843 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 08:33:06.972309 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.970847 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 08:33:06.972309 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.970851 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 08:33:06.972309 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.970854 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 08:33:06.972309 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.970856 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 08:33:06.972309 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.970859 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 08:33:06.972309 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.970861 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 08:33:06.972309 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.970863 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 08:33:06.972309 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.970866 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 08:33:06.972309 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.970868 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 08:33:06.972309 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.970870 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 08:33:06.972309 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.970873 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 08:33:06.972309 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.970875 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 08:33:06.972309 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.970878 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 08:33:06.972780 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.970880 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 08:33:06.972780 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.970883 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 08:33:06.972780 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.970898 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 08:33:06.972780 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.970903 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 08:33:06.972780 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.971290 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 08:33:06.972780 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.971296 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 08:33:06.972780 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.971300 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 08:33:06.972780 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.971303 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 08:33:06.972780 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.971307 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 08:33:06.972780 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.971310 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 08:33:06.972780 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.971313 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 08:33:06.972780 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.971315 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 08:33:06.972780 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.971318 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 08:33:06.972780 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.971320 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 08:33:06.972780 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.971323 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 08:33:06.972780 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.971325 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 08:33:06.972780 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.971327 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 08:33:06.972780 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.971330 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 08:33:06.972780 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.971333 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 08:33:06.973258 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.971335 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 08:33:06.973258 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.971338 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 08:33:06.973258 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.971340 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 08:33:06.973258 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.971342 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 08:33:06.973258 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.971345 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 08:33:06.973258 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.971347 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 08:33:06.973258 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.971350 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 08:33:06.973258 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.971353 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 08:33:06.973258 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.971355 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 08:33:06.973258 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.971358 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 08:33:06.973258 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.971362 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 08:33:06.973258 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.971365 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 08:33:06.973258 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.971367 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 08:33:06.973258 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.971370 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 08:33:06.973258 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.971373 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 08:33:06.973258 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.971375 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 08:33:06.973258 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.971378 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 08:33:06.973258 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.971380 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 08:33:06.973258 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.971383 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 08:33:06.973729 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.971387 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 08:33:06.973729 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.971390 2579 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 08:33:06.973729 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.971393 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 08:33:06.973729 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.971396 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 08:33:06.973729 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.971399 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 08:33:06.973729 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.971401 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 08:33:06.973729 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.971403 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 08:33:06.973729 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.971406 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 08:33:06.973729 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.971408 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 08:33:06.973729 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.971412 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 08:33:06.973729 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.971415 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 08:33:06.973729 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.971417 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 08:33:06.973729 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.971420 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 08:33:06.973729 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.971422 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 08:33:06.973729 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.971424 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 08:33:06.973729 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.971427 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 08:33:06.973729 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.971429 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 08:33:06.973729 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.971431 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 08:33:06.973729 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.971434 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 08:33:06.974208 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.971436 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 08:33:06.974208 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.971440 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 08:33:06.974208 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.971442 2579 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 08:33:06.974208 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.971445 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 08:33:06.974208 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.971448 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 08:33:06.974208 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.971450 2579 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 08:33:06.974208 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.971452 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 08:33:06.974208 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.971455 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 08:33:06.974208 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.971458 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 08:33:06.974208 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.971461 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 08:33:06.974208 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.971463 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 08:33:06.974208 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.971466 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 08:33:06.974208 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.971469 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 08:33:06.974208 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.971471 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 08:33:06.974208 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.971474 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 08:33:06.974208 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.971476 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 08:33:06.974208 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.971494 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 08:33:06.974208 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.971498 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 08:33:06.974208 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.971502 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 08:33:06.974208 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.971505 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 08:33:06.974684 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.971508 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 08:33:06.974684 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.971510 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 08:33:06.974684 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.971515 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 08:33:06.974684 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.971518 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 08:33:06.974684 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.971521 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 08:33:06.974684 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.971524 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 08:33:06.974684 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.971526 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 08:33:06.974684 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.971529 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 08:33:06.974684 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.971532 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 08:33:06.974684 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.971534 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 08:33:06.974684 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.971536 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 08:33:06.974684 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.971539 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 08:33:06.974684 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.971542 2579 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 08:33:06.974684 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972262 2579 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 08:33:06.974684 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972271 2579 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 08:33:06.974684 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972278 2579 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 08:33:06.974684 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972282 2579 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 08:33:06.974684 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972287 2579 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 08:33:06.974684 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972290 2579 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 08:33:06.974684 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972294 2579 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 08:33:06.974684 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972299 2579 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 08:33:06.975231 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972302 2579 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 08:33:06.975231 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972306 2579 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 08:33:06.975231 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972309 2579 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 08:33:06.975231 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972313 2579 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 08:33:06.975231 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972316 2579 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 08:33:06.975231 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972319 2579 flags.go:64] FLAG: --cgroup-root=""
Apr 16 08:33:06.975231 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972322 2579 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 08:33:06.975231 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972325 2579 flags.go:64] FLAG: --client-ca-file=""
Apr 16 08:33:06.975231 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972328 2579 flags.go:64] FLAG: --cloud-config=""
Apr 16 08:33:06.975231 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972331 2579 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 08:33:06.975231 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972334 2579 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 08:33:06.975231 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972339 2579 flags.go:64] FLAG: --cluster-domain=""
Apr 16 08:33:06.975231 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972341 2579 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 08:33:06.975231 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972345 2579 flags.go:64] FLAG: --config-dir=""
Apr 16 08:33:06.975231 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972347 2579 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 08:33:06.975231 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972351 2579 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 08:33:06.975231 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972355 2579 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 08:33:06.975231 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972358 2579 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 08:33:06.975231 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972361 2579 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 08:33:06.975231 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972364 2579 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 08:33:06.975231 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972367 2579 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 08:33:06.975231 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972370 2579 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 08:33:06.975231 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972374 2579 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 08:33:06.975231 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972377 2579 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 08:33:06.975231 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972380 2579 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 08:33:06.975821 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972384 2579 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 08:33:06.975821 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972387 2579 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 08:33:06.975821 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972390 2579 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 08:33:06.975821 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972393 2579 flags.go:64] FLAG: --enable-load-reader="false"
Apr 16 08:33:06.975821 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972397 2579 flags.go:64] FLAG: --enable-server="true"
Apr 16 08:33:06.975821 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972400 2579 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 16 08:33:06.975821 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972404 2579 flags.go:64] FLAG: --event-burst="100"
Apr 16 08:33:06.975821 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972407 2579 flags.go:64] FLAG: --event-qps="50"
Apr 16 08:33:06.975821 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972410 2579 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 16 08:33:06.975821 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972414 2579 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 16 08:33:06.975821 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972417 2579 flags.go:64] FLAG: --eviction-hard=""
Apr 16 08:33:06.975821 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972421 2579 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 16 08:33:06.975821 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972423 2579 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 16 08:33:06.975821 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972427 2579 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 16 08:33:06.975821 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972430 2579 flags.go:64] FLAG: --eviction-soft=""
Apr 16 08:33:06.975821 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972432 2579 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 16 08:33:06.975821 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972435 2579 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 16 08:33:06.975821 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972438 2579 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 16 08:33:06.975821 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972441 2579 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 16 08:33:06.975821 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972444 2579 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 16 08:33:06.975821 ip-10-0-130-41
kubenswrapper[2579]: I0416 08:33:06.972446 2579 flags.go:64] FLAG: --fail-swap-on="true" Apr 16 08:33:06.975821 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972449 2579 flags.go:64] FLAG: --feature-gates="" Apr 16 08:33:06.975821 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972452 2579 flags.go:64] FLAG: --file-check-frequency="20s" Apr 16 08:33:06.975821 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972455 2579 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 16 08:33:06.975821 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972459 2579 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 16 08:33:06.976453 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972462 2579 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 16 08:33:06.976453 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972464 2579 flags.go:64] FLAG: --healthz-port="10248" Apr 16 08:33:06.976453 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972467 2579 flags.go:64] FLAG: --help="false" Apr 16 08:33:06.976453 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972470 2579 flags.go:64] FLAG: --hostname-override="ip-10-0-130-41.ec2.internal" Apr 16 08:33:06.976453 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972473 2579 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 16 08:33:06.976453 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972477 2579 flags.go:64] FLAG: --http-check-frequency="20s" Apr 16 08:33:06.976453 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972479 2579 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 16 08:33:06.976453 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972483 2579 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 16 08:33:06.976453 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972486 2579 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 16 
08:33:06.976453 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972489 2579 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 16 08:33:06.976453 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972493 2579 flags.go:64] FLAG: --image-service-endpoint="" Apr 16 08:33:06.976453 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972495 2579 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 16 08:33:06.976453 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972498 2579 flags.go:64] FLAG: --kube-api-burst="100" Apr 16 08:33:06.976453 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972501 2579 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 16 08:33:06.976453 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972504 2579 flags.go:64] FLAG: --kube-api-qps="50" Apr 16 08:33:06.976453 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972507 2579 flags.go:64] FLAG: --kube-reserved="" Apr 16 08:33:06.976453 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972510 2579 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 16 08:33:06.976453 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972513 2579 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 16 08:33:06.976453 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972516 2579 flags.go:64] FLAG: --kubelet-cgroups="" Apr 16 08:33:06.976453 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972519 2579 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 16 08:33:06.976453 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972522 2579 flags.go:64] FLAG: --lock-file="" Apr 16 08:33:06.976453 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972524 2579 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 16 08:33:06.976453 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972527 2579 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 16 08:33:06.976453 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972530 2579 flags.go:64] FLAG: 
--log-json-info-buffer-size="0" Apr 16 08:33:06.977057 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972535 2579 flags.go:64] FLAG: --log-json-split-stream="false" Apr 16 08:33:06.977057 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972539 2579 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 16 08:33:06.977057 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972541 2579 flags.go:64] FLAG: --log-text-split-stream="false" Apr 16 08:33:06.977057 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972544 2579 flags.go:64] FLAG: --logging-format="text" Apr 16 08:33:06.977057 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972547 2579 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 16 08:33:06.977057 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972550 2579 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 16 08:33:06.977057 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972553 2579 flags.go:64] FLAG: --manifest-url="" Apr 16 08:33:06.977057 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972556 2579 flags.go:64] FLAG: --manifest-url-header="" Apr 16 08:33:06.977057 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972560 2579 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 16 08:33:06.977057 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972563 2579 flags.go:64] FLAG: --max-open-files="1000000" Apr 16 08:33:06.977057 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972571 2579 flags.go:64] FLAG: --max-pods="110" Apr 16 08:33:06.977057 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972574 2579 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 16 08:33:06.977057 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972577 2579 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 16 08:33:06.977057 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972580 2579 flags.go:64] FLAG: --memory-manager-policy="None" Apr 16 08:33:06.977057 ip-10-0-130-41 
kubenswrapper[2579]: I0416 08:33:06.972583 2579 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 16 08:33:06.977057 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972586 2579 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 16 08:33:06.977057 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972589 2579 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 16 08:33:06.977057 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972592 2579 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 16 08:33:06.977057 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972599 2579 flags.go:64] FLAG: --node-status-max-images="50" Apr 16 08:33:06.977057 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972602 2579 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 16 08:33:06.977057 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972605 2579 flags.go:64] FLAG: --oom-score-adj="-999" Apr 16 08:33:06.977057 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972608 2579 flags.go:64] FLAG: --pod-cidr="" Apr 16 08:33:06.977057 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972611 2579 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dc76bab72f320de3d4105c90d73c4fb139c09e20ce0fa8dcbc0cb59920d27dec" Apr 16 08:33:06.977605 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972616 2579 flags.go:64] FLAG: --pod-manifest-path="" Apr 16 08:33:06.977605 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972619 2579 flags.go:64] FLAG: --pod-max-pids="-1" Apr 16 08:33:06.977605 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972622 2579 flags.go:64] FLAG: --pods-per-core="0" Apr 16 08:33:06.977605 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972625 2579 flags.go:64] FLAG: --port="10250" Apr 16 08:33:06.977605 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972628 2579 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 16 08:33:06.977605 
ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972631 2579 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-016dd26aa330de760" Apr 16 08:33:06.977605 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972634 2579 flags.go:64] FLAG: --qos-reserved="" Apr 16 08:33:06.977605 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972637 2579 flags.go:64] FLAG: --read-only-port="10255" Apr 16 08:33:06.977605 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972639 2579 flags.go:64] FLAG: --register-node="true" Apr 16 08:33:06.977605 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972642 2579 flags.go:64] FLAG: --register-schedulable="true" Apr 16 08:33:06.977605 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972645 2579 flags.go:64] FLAG: --register-with-taints="" Apr 16 08:33:06.977605 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972649 2579 flags.go:64] FLAG: --registry-burst="10" Apr 16 08:33:06.977605 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972652 2579 flags.go:64] FLAG: --registry-qps="5" Apr 16 08:33:06.977605 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972655 2579 flags.go:64] FLAG: --reserved-cpus="" Apr 16 08:33:06.977605 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972657 2579 flags.go:64] FLAG: --reserved-memory="" Apr 16 08:33:06.977605 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972661 2579 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 16 08:33:06.977605 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972664 2579 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 16 08:33:06.977605 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972667 2579 flags.go:64] FLAG: --rotate-certificates="false" Apr 16 08:33:06.977605 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972669 2579 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 16 08:33:06.977605 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972672 2579 flags.go:64] FLAG: --runonce="false" Apr 16 08:33:06.977605 ip-10-0-130-41 
kubenswrapper[2579]: I0416 08:33:06.972676 2579 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 16 08:33:06.977605 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972679 2579 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 16 08:33:06.977605 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972682 2579 flags.go:64] FLAG: --seccomp-default="false" Apr 16 08:33:06.977605 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972685 2579 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 16 08:33:06.977605 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972688 2579 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 16 08:33:06.977605 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972691 2579 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 16 08:33:06.978307 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972694 2579 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 16 08:33:06.978307 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972697 2579 flags.go:64] FLAG: --storage-driver-password="root" Apr 16 08:33:06.978307 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972700 2579 flags.go:64] FLAG: --storage-driver-secure="false" Apr 16 08:33:06.978307 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972703 2579 flags.go:64] FLAG: --storage-driver-table="stats" Apr 16 08:33:06.978307 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972705 2579 flags.go:64] FLAG: --storage-driver-user="root" Apr 16 08:33:06.978307 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972708 2579 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 16 08:33:06.978307 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972711 2579 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 16 08:33:06.978307 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972715 2579 flags.go:64] FLAG: --system-cgroups="" Apr 16 08:33:06.978307 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972718 2579 flags.go:64] FLAG: 
--system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 16 08:33:06.978307 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972724 2579 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 16 08:33:06.978307 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972727 2579 flags.go:64] FLAG: --tls-cert-file="" Apr 16 08:33:06.978307 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972730 2579 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 16 08:33:06.978307 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972735 2579 flags.go:64] FLAG: --tls-min-version="" Apr 16 08:33:06.978307 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972737 2579 flags.go:64] FLAG: --tls-private-key-file="" Apr 16 08:33:06.978307 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972740 2579 flags.go:64] FLAG: --topology-manager-policy="none" Apr 16 08:33:06.978307 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972743 2579 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 16 08:33:06.978307 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972746 2579 flags.go:64] FLAG: --topology-manager-scope="container" Apr 16 08:33:06.978307 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972749 2579 flags.go:64] FLAG: --v="2" Apr 16 08:33:06.978307 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972752 2579 flags.go:64] FLAG: --version="false" Apr 16 08:33:06.978307 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972756 2579 flags.go:64] FLAG: --vmodule="" Apr 16 08:33:06.978307 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972766 2579 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 16 08:33:06.978307 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.972769 2579 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 16 08:33:06.978307 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.972859 2579 feature_gate.go:328] unrecognized feature gate: Example Apr 16 08:33:06.978307 ip-10-0-130-41 kubenswrapper[2579]: W0416 
08:33:06.972863 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 08:33:06.978307 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.972866 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 08:33:06.978909 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.972869 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 08:33:06.978909 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.972872 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 08:33:06.978909 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.972875 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 08:33:06.978909 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.972877 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 08:33:06.978909 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.972879 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 08:33:06.978909 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.972882 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 08:33:06.978909 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.972884 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 08:33:06.978909 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.972901 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 08:33:06.978909 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.972904 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 08:33:06.978909 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.972907 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 08:33:06.978909 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.972909 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 08:33:06.978909 
ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.972912 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 08:33:06.978909 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.972914 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 08:33:06.978909 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.972917 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 08:33:06.978909 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.972921 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 08:33:06.978909 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.972923 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 08:33:06.978909 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.972926 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 08:33:06.978909 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.972929 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 08:33:06.978909 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.972932 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 08:33:06.978909 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.972934 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 08:33:06.979424 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.972937 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 08:33:06.979424 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.972939 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 08:33:06.979424 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.972942 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 08:33:06.979424 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.972944 2579 
feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 08:33:06.979424 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.972947 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 08:33:06.979424 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.972949 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 08:33:06.979424 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.972952 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 08:33:06.979424 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.972965 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 16 08:33:06.979424 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.972969 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 08:33:06.979424 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.972972 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 08:33:06.979424 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.972975 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 08:33:06.979424 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.972978 2579 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 08:33:06.979424 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.972981 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 08:33:06.979424 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.972984 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 08:33:06.979424 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.972987 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 08:33:06.979424 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.972990 2579 feature_gate.go:328] unrecognized feature gate: 
NutanixMultiSubnets Apr 16 08:33:06.979424 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.972992 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 08:33:06.979424 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.972994 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 08:33:06.979424 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.972997 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 08:33:06.979896 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.972999 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 08:33:06.979896 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.973002 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 08:33:06.979896 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.973004 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 08:33:06.979896 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.973007 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 08:33:06.979896 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.973009 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 08:33:06.979896 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.973012 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 08:33:06.979896 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.973014 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 08:33:06.979896 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.973018 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 08:33:06.979896 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.973021 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 08:33:06.979896 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.973023 2579 feature_gate.go:328] 
unrecognized feature gate: DualReplica Apr 16 08:33:06.979896 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.973026 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 08:33:06.979896 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.973028 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 08:33:06.979896 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.973031 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 08:33:06.979896 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.973034 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 08:33:06.979896 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.973037 2579 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 08:33:06.979896 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.973039 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 08:33:06.979896 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.973042 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 08:33:06.979896 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.973044 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 08:33:06.979896 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.973047 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 08:33:06.979896 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.973049 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 08:33:06.980435 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.973051 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 08:33:06.980435 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.973054 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 08:33:06.980435 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.973056 2579 
feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 08:33:06.980435 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.973059 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 08:33:06.980435 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.973061 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 08:33:06.980435 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.973064 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 08:33:06.980435 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.973066 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 08:33:06.980435 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.973069 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 08:33:06.980435 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.973072 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 08:33:06.980435 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.973074 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 08:33:06.980435 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.973077 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 08:33:06.980435 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.973079 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 08:33:06.980435 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.973082 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 08:33:06.980435 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.973084 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 08:33:06.980435 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.973087 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 08:33:06.980435 ip-10-0-130-41 
kubenswrapper[2579]: W0416 08:33:06.973090 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 08:33:06.980435 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.973092 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 08:33:06.980435 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.973095 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 08:33:06.980435 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.973099 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 16 08:33:06.981253 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.973103 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 08:33:06.981253 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.973106 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 08:33:06.981253 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.973108 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 08:33:06.981253 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.973111 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 08:33:06.981253 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.973113 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 08:33:06.981253 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.974019 2579 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true 
UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 08:33:06.983172 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.983153 2579 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 16 08:33:06.983211 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.983174 2579 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 16 08:33:06.983246 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983234 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 08:33:06.983246 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983240 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 08:33:06.983246 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983243 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 08:33:06.983246 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983247 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 08:33:06.983382 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983250 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 08:33:06.983382 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983254 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 08:33:06.983382 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983257 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 08:33:06.983382 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983260 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 08:33:06.983382 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983263 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 08:33:06.983382 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983267 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 08:33:06.983382 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983269 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 08:33:06.983382 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983272 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 08:33:06.983382 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983275 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 08:33:06.983382 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983278 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 08:33:06.983382 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983282 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 08:33:06.983382 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983287 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 08:33:06.983382 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983290 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 08:33:06.983382 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983293 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 08:33:06.983382 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983296 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 08:33:06.983382 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983299 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 08:33:06.983382 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983302 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 08:33:06.983382 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983304 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 08:33:06.983382 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983307 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 08:33:06.983382 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983310 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 08:33:06.983872 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983312 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 08:33:06.983872 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983315 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 08:33:06.983872 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983318 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 08:33:06.983872 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983321 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 08:33:06.983872 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983323 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 08:33:06.983872 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983326 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 08:33:06.983872 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983328 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 08:33:06.983872 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983331 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 08:33:06.983872 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983333 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 08:33:06.983872 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983335 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 08:33:06.983872 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983338 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 08:33:06.983872 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983341 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 08:33:06.983872 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983343 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 08:33:06.983872 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983346 2579 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 08:33:06.983872 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983349 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 08:33:06.983872 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983351 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 08:33:06.983872 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983354 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 08:33:06.983872 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983357 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 08:33:06.983872 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983360 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 08:33:06.983872 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983363 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 08:33:06.984386 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983365 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 08:33:06.984386 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983368 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 08:33:06.984386 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983370 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 08:33:06.984386 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983373 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 08:33:06.984386 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983375 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 08:33:06.984386 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983378 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 08:33:06.984386 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983380 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 08:33:06.984386 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983383 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 08:33:06.984386 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983386 2579 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 08:33:06.984386 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983389 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 08:33:06.984386 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983391 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 08:33:06.984386 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983394 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 08:33:06.984386 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983397 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 08:33:06.984386 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983399 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 08:33:06.984386 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983401 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 08:33:06.984386 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983404 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 08:33:06.984386 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983406 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 08:33:06.984386 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983410 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 08:33:06.984386 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983413 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 08:33:06.984833 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983416 2579 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 08:33:06.984833 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983418 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 08:33:06.984833 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983421 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 08:33:06.984833 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983423 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 08:33:06.984833 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983426 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 08:33:06.984833 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983429 2579 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 08:33:06.984833 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983431 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 08:33:06.984833 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983434 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 08:33:06.984833 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983436 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 08:33:06.984833 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983439 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 08:33:06.984833 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983441 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 08:33:06.984833 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983444 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 08:33:06.984833 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983446 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 08:33:06.984833 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983449 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 08:33:06.984833 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983451 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 08:33:06.984833 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983454 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 08:33:06.984833 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983457 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 08:33:06.984833 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983459 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 08:33:06.984833 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983462 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 08:33:06.984833 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983464 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 08:33:06.985353 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983466 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 08:33:06.985353 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983469 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 08:33:06.985353 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983472 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 08:33:06.985353 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.983477 2579 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 08:33:06.985353 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983573 2579 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 08:33:06.985353 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983578 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 08:33:06.985353 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983580 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 08:33:06.985353 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983583 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 08:33:06.985353 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983586 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 08:33:06.985353 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983588 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 08:33:06.985353 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983593 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 08:33:06.985353 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983596 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 08:33:06.985353 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983599 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 08:33:06.985353 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983603 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 08:33:06.985353 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983605 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 08:33:06.985728 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983608 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 08:33:06.985728 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983610 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 08:33:06.985728 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983613 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 08:33:06.985728 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983617 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 08:33:06.985728 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983619 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 08:33:06.985728 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983622 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 08:33:06.985728 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983624 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 08:33:06.985728 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983627 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 08:33:06.985728 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983629 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 08:33:06.985728 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983632 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 08:33:06.985728 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983635 2579 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 08:33:06.985728 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983638 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 08:33:06.985728 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983640 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 08:33:06.985728 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983643 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 08:33:06.985728 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983645 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 08:33:06.985728 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983648 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 08:33:06.985728 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983650 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 08:33:06.985728 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983653 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 08:33:06.985728 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983655 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 08:33:06.985728 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983658 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 08:33:06.986236 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983660 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 08:33:06.986236 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983663 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 08:33:06.986236 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983665 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 08:33:06.986236 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983667 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 08:33:06.986236 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983670 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 08:33:06.986236 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983672 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 08:33:06.986236 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983675 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 08:33:06.986236 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983677 2579 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 08:33:06.986236 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983680 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 08:33:06.986236 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983683 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 08:33:06.986236 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983685 2579 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 08:33:06.986236 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983688 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 08:33:06.986236 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983690 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 08:33:06.986236 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983693 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 08:33:06.986236 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983695 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 08:33:06.986236 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983698 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 08:33:06.986236 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983701 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 08:33:06.986236 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983705 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 08:33:06.986236 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983708 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 08:33:06.986236 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983710 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 08:33:06.986724 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983713 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 08:33:06.986724 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983715 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 08:33:06.986724 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983718 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 08:33:06.986724 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983720 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 08:33:06.986724 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983724 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 08:33:06.986724 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983726 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 08:33:06.986724 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983728 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 08:33:06.986724 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983731 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 08:33:06.986724 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983733 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 08:33:06.986724 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983736 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 08:33:06.986724 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983738 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 08:33:06.986724 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983741 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 08:33:06.986724 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983744 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 08:33:06.986724 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983746 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 08:33:06.986724 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983748 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 08:33:06.986724 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983751 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 08:33:06.986724 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983753 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 08:33:06.986724 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983756 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 08:33:06.986724 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983759 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 08:33:06.986724 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983761 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 08:33:06.987293 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983764 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 08:33:06.987293 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983766 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 08:33:06.987293 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983768 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 08:33:06.987293 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983771 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 08:33:06.987293 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983773 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 08:33:06.987293 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983776 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 08:33:06.987293 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983778 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 08:33:06.987293 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983781 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 08:33:06.987293 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983783 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 08:33:06.987293 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983786 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 08:33:06.987293 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983788 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 08:33:06.987293 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983791 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 08:33:06.987293 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983793 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 08:33:06.987293 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983796 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 08:33:06.987293 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:06.983798 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 08:33:06.987663 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.983803 2579 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 08:33:06.987663 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.984602 2579 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 16 08:33:06.988214 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.988201 2579 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 16 08:33:06.989398 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.989386 2579 server.go:1019] "Starting client certificate rotation"
Apr 16 08:33:06.989514 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.989495 2579 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 08:33:06.989549 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:06.989540 2579 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 08:33:07.017499 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:07.017479 2579 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 08:33:07.021957 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:07.021922 2579 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 08:33:07.038116 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:07.038088 2579 log.go:25] "Validated CRI v1 runtime API"
Apr 16 08:33:07.044351 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:07.044334 2579 log.go:25] "Validated CRI v1 image API"
Apr 16 08:33:07.045465 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:07.045441 2579 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 16 08:33:07.050119 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:07.050100 2579 fs.go:135] Filesystem UUIDs: map[15d9a79f-c448-41cd-a179-cbc123576ed3:/dev/nvme0n1p3 70ade21c-8904-4c88-9101-20c39bf90848:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2]
Apr 16 08:33:07.050170 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:07.050120 2579 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 16 08:33:07.055603 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:07.055585 2579 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 08:33:07.055788 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:07.055684 2579 manager.go:217] Machine: {Timestamp:2026-04-16 08:33:07.053751595 +0000 UTC m=+0.460790552 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3100270 MemoryCapacity:32812171264 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2dfde0f836e1baa9448cd49d961516 SystemUUID:ec2dfde0-f836-e1ba-a944-8cd49d961516 BootID:fae429d2-a719-4f91-a07d-e4af885f2321 Filesystems:[{Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6098944 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406085632 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406085632 Type:vfs Inodes:4005392 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:e0:d0:f9:32:39 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:e0:d0:f9:32:39 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:ca:68:c8:f5:34:46 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812171264 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 16 08:33:07.055788 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:07.055786 2579 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 16 08:33:07.055881 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:07.055869 2579 manager.go:233] Version: {KernelVersion:5.14.0-570.104.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260401-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 16 08:33:07.056998 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:07.056977 2579 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 16 08:33:07.057140 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:07.057001 2579 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-130-41.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 16 08:33:07.057196 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:07.057151 2579 topology_manager.go:138] "Creating topology manager with none policy"
Apr 16 08:33:07.057196 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:07.057164 2579 container_manager_linux.go:306] "Creating device plugin manager"
Apr 16 08:33:07.057196 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:07.057177 2579 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 08:33:07.058425 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:07.058414 2579 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 08:33:07.059880 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:07.059870 2579 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 08:33:07.060159 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:07.060149 2579 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 16 08:33:07.062707 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:07.062697 2579 kubelet.go:491] "Attempting to sync node with API server"
Apr 16 08:33:07.062746 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:07.062717 2579 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 16 08:33:07.062746 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:07.062729 2579 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 16 08:33:07.062746 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:07.062738 2579 kubelet.go:397] "Adding apiserver pod source"
Apr 16 08:33:07.062746 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:07.062746 2579 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 16 08:33:07.063906 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:07.063875 2579 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 16 08:33:07.063906 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:07.063908 2579 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 16 08:33:07.069258 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:07.069240 2579 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 16 08:33:07.071187 ip-10-0-130-41
kubenswrapper[2579]: I0416 08:33:07.071173 2579 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 16 08:33:07.072520 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:07.072508 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 16 08:33:07.072568 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:07.072525 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 16 08:33:07.072568 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:07.072531 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 16 08:33:07.072568 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:07.072537 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 16 08:33:07.072568 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:07.072542 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 16 08:33:07.072568 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:07.072548 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 16 08:33:07.072568 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:07.072555 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 16 08:33:07.072568 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:07.072561 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 16 08:33:07.072568 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:07.072568 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 16 08:33:07.072802 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:07.072573 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 16 08:33:07.072802 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:07.072588 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 16 08:33:07.072802 
ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:07.072597 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 16 08:33:07.073512 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:07.073501 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 16 08:33:07.073544 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:07.073513 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 16 08:33:07.075722 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:33:07.075673 2579 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 16 08:33:07.075722 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:33:07.075676 2579 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-130-41.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 16 08:33:07.077247 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:07.077230 2579 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 16 08:33:07.077325 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:07.077266 2579 server.go:1295] "Started kubelet" Apr 16 08:33:07.077325 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:07.077305 2579 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-130-41.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 16 08:33:07.077414 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:07.077357 2579 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" 
qps=100 burstTokens=10 Apr 16 08:33:07.077785 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:07.077769 2579 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 16 08:33:07.077911 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:07.077864 2579 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 16 08:33:07.078128 ip-10-0-130-41 systemd[1]: Started Kubernetes Kubelet. Apr 16 08:33:07.079197 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:07.079176 2579 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 16 08:33:07.083427 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:07.083394 2579 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-7tq2r" Apr 16 08:33:07.085213 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:07.085194 2579 server.go:317] "Adding debug handlers to kubelet server" Apr 16 08:33:07.086433 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:33:07.085286 2579 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-130-41.ec2.internal.18a6c941b42e807a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-130-41.ec2.internal,UID:ip-10-0-130-41.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-130-41.ec2.internal,},FirstTimestamp:2026-04-16 08:33:07.077243002 +0000 UTC m=+0.484281975,LastTimestamp:2026-04-16 08:33:07.077243002 +0000 UTC m=+0.484281975,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-130-41.ec2.internal,}" Apr 16 08:33:07.090250 
ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:07.090181 2579 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-7tq2r" Apr 16 08:33:07.090921 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:07.090871 2579 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 16 08:33:07.091106 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:07.091083 2579 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 16 08:33:07.092161 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:33:07.092138 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-41.ec2.internal\" not found" Apr 16 08:33:07.093338 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:07.092342 2579 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 16 08:33:07.093338 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:07.092363 2579 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 16 08:33:07.093338 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:07.092460 2579 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 16 08:33:07.093562 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:07.093544 2579 reconstruct.go:97] "Volume reconstruction finished" Apr 16 08:33:07.093562 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:07.093559 2579 reconciler.go:26] "Reconciler: start to sync state" Apr 16 08:33:07.094266 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:07.094249 2579 factory.go:55] Registering systemd factory Apr 16 08:33:07.094366 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:07.094355 2579 factory.go:223] Registration of the systemd container factory successfully Apr 16 08:33:07.094699 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:07.094680 2579 factory.go:153] Registering CRI-O factory Apr 16 08:33:07.094778 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:07.094702 2579 factory.go:223] 
Registration of the crio container factory successfully Apr 16 08:33:07.094778 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:07.094767 2579 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 16 08:33:07.094879 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:07.094791 2579 factory.go:103] Registering Raw factory Apr 16 08:33:07.094879 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:07.094806 2579 manager.go:1196] Started watching for new ooms in manager Apr 16 08:33:07.096334 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:07.096319 2579 manager.go:319] Starting recovery of all containers Apr 16 08:33:07.097050 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:33:07.097027 2579 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 16 08:33:07.101259 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:07.101238 2579 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 08:33:07.105434 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:07.105419 2579 manager.go:324] Recovery completed Apr 16 08:33:07.105511 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:33:07.105490 2579 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-130-41.ec2.internal\" not found" node="ip-10-0-130-41.ec2.internal" Apr 16 08:33:07.109820 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:07.109801 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 08:33:07.112342 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:07.112328 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-41.ec2.internal" 
event="NodeHasSufficientMemory" Apr 16 08:33:07.112413 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:07.112358 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-41.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 08:33:07.112413 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:07.112371 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-41.ec2.internal" event="NodeHasSufficientPID" Apr 16 08:33:07.112862 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:07.112845 2579 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 16 08:33:07.112862 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:07.112860 2579 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 16 08:33:07.113032 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:07.112880 2579 state_mem.go:36] "Initialized new in-memory state store" Apr 16 08:33:07.115513 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:07.115499 2579 policy_none.go:49] "None policy: Start" Apr 16 08:33:07.115608 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:07.115519 2579 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 16 08:33:07.115608 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:07.115532 2579 state_mem.go:35] "Initializing new in-memory state store" Apr 16 08:33:07.155506 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:07.155484 2579 manager.go:341] "Starting Device Plugin manager" Apr 16 08:33:07.162705 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:33:07.155521 2579 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 16 08:33:07.162705 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:07.155535 2579 server.go:85] "Starting device plugin registration server" Apr 16 08:33:07.162705 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:07.155750 2579 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 16 08:33:07.162705 ip-10-0-130-41 
kubenswrapper[2579]: I0416 08:33:07.155761 2579 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 16 08:33:07.162705 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:07.155845 2579 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 16 08:33:07.162705 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:07.155935 2579 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 16 08:33:07.162705 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:07.155944 2579 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 16 08:33:07.162705 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:33:07.156507 2579 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 16 08:33:07.162705 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:33:07.156549 2579 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-130-41.ec2.internal\" not found" Apr 16 08:33:07.182931 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:07.182905 2579 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 16 08:33:07.184233 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:07.184218 2579 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 16 08:33:07.184275 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:07.184248 2579 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 16 08:33:07.184275 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:07.184270 2579 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 16 08:33:07.184340 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:07.184279 2579 kubelet.go:2451] "Starting kubelet main sync loop" Apr 16 08:33:07.184340 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:33:07.184316 2579 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 16 08:33:07.187729 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:07.187709 2579 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 08:33:07.255977 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:07.255921 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 08:33:07.256995 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:07.256979 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-41.ec2.internal" event="NodeHasSufficientMemory" Apr 16 08:33:07.257076 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:07.257009 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-41.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 08:33:07.257076 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:07.257020 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-41.ec2.internal" event="NodeHasSufficientPID" Apr 16 08:33:07.257076 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:07.257046 2579 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-130-41.ec2.internal" Apr 16 08:33:07.265243 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:07.265229 2579 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-130-41.ec2.internal" Apr 16 08:33:07.265289 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:33:07.265251 2579 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-130-41.ec2.internal\": node \"ip-10-0-130-41.ec2.internal\" not found" Apr 16 08:33:07.276684 
ip-10-0-130-41 kubenswrapper[2579]: E0416 08:33:07.276663 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-41.ec2.internal\" not found" Apr 16 08:33:07.284391 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:07.284370 2579 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-41.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-130-41.ec2.internal"] Apr 16 08:33:07.284463 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:07.284434 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 08:33:07.285196 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:07.285179 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-41.ec2.internal" event="NodeHasSufficientMemory" Apr 16 08:33:07.285264 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:07.285208 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-41.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 08:33:07.285264 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:07.285218 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-41.ec2.internal" event="NodeHasSufficientPID" Apr 16 08:33:07.286506 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:07.286493 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 08:33:07.286656 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:07.286644 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-41.ec2.internal" Apr 16 08:33:07.286707 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:07.286672 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 08:33:07.287118 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:07.287105 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-41.ec2.internal" event="NodeHasSufficientMemory" Apr 16 08:33:07.287190 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:07.287121 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-41.ec2.internal" event="NodeHasSufficientMemory" Apr 16 08:33:07.287190 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:07.287133 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-41.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 08:33:07.287190 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:07.287138 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-41.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 08:33:07.287190 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:07.287145 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-41.ec2.internal" event="NodeHasSufficientPID" Apr 16 08:33:07.287190 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:07.287147 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-41.ec2.internal" event="NodeHasSufficientPID" Apr 16 08:33:07.288916 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:07.288880 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-41.ec2.internal" Apr 16 08:33:07.288968 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:07.288925 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 08:33:07.289642 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:07.289627 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-41.ec2.internal" event="NodeHasSufficientMemory" Apr 16 08:33:07.289699 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:07.289660 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-41.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 08:33:07.289699 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:07.289676 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-41.ec2.internal" event="NodeHasSufficientPID" Apr 16 08:33:07.294000 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:07.293977 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/bb5cb6fea81fd2041eddb8622fb55be0-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-130-41.ec2.internal\" (UID: \"bb5cb6fea81fd2041eddb8622fb55be0\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-41.ec2.internal" Apr 16 08:33:07.294079 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:07.294010 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bb5cb6fea81fd2041eddb8622fb55be0-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-130-41.ec2.internal\" (UID: \"bb5cb6fea81fd2041eddb8622fb55be0\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-41.ec2.internal" Apr 16 08:33:07.294079 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:07.294028 2579 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/2c43aaa946be06bbbb13a479a93166e2-config\") pod \"kube-apiserver-proxy-ip-10-0-130-41.ec2.internal\" (UID: \"2c43aaa946be06bbbb13a479a93166e2\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-130-41.ec2.internal" Apr 16 08:33:07.322514 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:33:07.322495 2579 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-130-41.ec2.internal\" not found" node="ip-10-0-130-41.ec2.internal" Apr 16 08:33:07.326870 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:33:07.326853 2579 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-130-41.ec2.internal\" not found" node="ip-10-0-130-41.ec2.internal" Apr 16 08:33:07.377469 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:33:07.377452 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-41.ec2.internal\" not found" Apr 16 08:33:07.394259 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:07.394242 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/2c43aaa946be06bbbb13a479a93166e2-config\") pod \"kube-apiserver-proxy-ip-10-0-130-41.ec2.internal\" (UID: \"2c43aaa946be06bbbb13a479a93166e2\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-130-41.ec2.internal" Apr 16 08:33:07.394343 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:07.394272 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/bb5cb6fea81fd2041eddb8622fb55be0-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-130-41.ec2.internal\" (UID: \"bb5cb6fea81fd2041eddb8622fb55be0\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-41.ec2.internal" Apr 16 08:33:07.394343 
ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:07.394297 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bb5cb6fea81fd2041eddb8622fb55be0-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-130-41.ec2.internal\" (UID: \"bb5cb6fea81fd2041eddb8622fb55be0\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-41.ec2.internal" Apr 16 08:33:07.394343 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:07.394335 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bb5cb6fea81fd2041eddb8622fb55be0-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-130-41.ec2.internal\" (UID: \"bb5cb6fea81fd2041eddb8622fb55be0\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-41.ec2.internal" Apr 16 08:33:07.394439 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:07.394339 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/2c43aaa946be06bbbb13a479a93166e2-config\") pod \"kube-apiserver-proxy-ip-10-0-130-41.ec2.internal\" (UID: \"2c43aaa946be06bbbb13a479a93166e2\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-130-41.ec2.internal" Apr 16 08:33:07.394439 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:07.394350 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/bb5cb6fea81fd2041eddb8622fb55be0-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-130-41.ec2.internal\" (UID: \"bb5cb6fea81fd2041eddb8622fb55be0\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-41.ec2.internal" Apr 16 08:33:07.478476 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:33:07.478449 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-41.ec2.internal\" not found" Apr 16 08:33:07.579280 
ip-10-0-130-41 kubenswrapper[2579]: E0416 08:33:07.579223 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-41.ec2.internal\" not found"
Apr 16 08:33:07.625872 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:07.625835 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-41.ec2.internal"
Apr 16 08:33:07.629697 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:07.629684 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-41.ec2.internal"
Apr 16 08:33:07.680274 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:33:07.680244 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-41.ec2.internal\" not found"
Apr 16 08:33:07.780840 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:33:07.780818 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-41.ec2.internal\" not found"
Apr 16 08:33:07.881448 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:33:07.881375 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-41.ec2.internal\" not found"
Apr 16 08:33:07.982057 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:33:07.982024 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-41.ec2.internal\" not found"
Apr 16 08:33:07.990183 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:07.990162 2579 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 16 08:33:07.990316 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:07.990296 2579 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 08:33:07.990355 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:07.990337 2579 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 08:33:08.082260 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:33:08.082232 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-41.ec2.internal\" not found"
Apr 16 08:33:08.092171 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:08.092151 2579 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 16 08:33:08.093403 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:08.093368 2579 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 08:28:07 +0000 UTC" deadline="2027-12-13 10:13:35.150258118 +0000 UTC"
Apr 16 08:33:08.093471 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:08.093403 2579 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14545h40m27.056858213s"
Apr 16 08:33:08.105847 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:08.105789 2579 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 08:33:08.131884 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:08.131832 2579 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-789xh"
Apr 16 08:33:08.140410 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:08.140391 2579 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-789xh"
Apr 16 08:33:08.157448 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:08.157417 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2c43aaa946be06bbbb13a479a93166e2.slice/crio-859eb3432ba0e356fcdaf6586521b7eaabd8e315297d9482324742a2d05b7d2a WatchSource:0}: Error finding container 859eb3432ba0e356fcdaf6586521b7eaabd8e315297d9482324742a2d05b7d2a: Status 404 returned error can't find the container with id 859eb3432ba0e356fcdaf6586521b7eaabd8e315297d9482324742a2d05b7d2a
Apr 16 08:33:08.157719 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:08.157699 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbb5cb6fea81fd2041eddb8622fb55be0.slice/crio-123540de6b6747412b0490066541ec4b6881d28f7bb6d9393887119ceaf2f2c3 WatchSource:0}: Error finding container 123540de6b6747412b0490066541ec4b6881d28f7bb6d9393887119ceaf2f2c3: Status 404 returned error can't find the container with id 123540de6b6747412b0490066541ec4b6881d28f7bb6d9393887119ceaf2f2c3
Apr 16 08:33:08.162601 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:08.162586 2579 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 08:33:08.183367 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:33:08.183337 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-41.ec2.internal\" not found"
Apr 16 08:33:08.187070 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:08.187019 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-41.ec2.internal" event={"ID":"2c43aaa946be06bbbb13a479a93166e2","Type":"ContainerStarted","Data":"859eb3432ba0e356fcdaf6586521b7eaabd8e315297d9482324742a2d05b7d2a"}
Apr 16 08:33:08.188040 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:08.188017 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-41.ec2.internal" event={"ID":"bb5cb6fea81fd2041eddb8622fb55be0","Type":"ContainerStarted","Data":"123540de6b6747412b0490066541ec4b6881d28f7bb6d9393887119ceaf2f2c3"}
Apr 16 08:33:08.283596 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:33:08.283569 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-41.ec2.internal\" not found"
Apr 16 08:33:08.372781 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:08.372754 2579 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 08:33:08.383946 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:33:08.383876 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-41.ec2.internal\" not found"
Apr 16 08:33:08.484473 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:33:08.484440 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-41.ec2.internal\" not found"
Apr 16 08:33:08.585402 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:33:08.585374 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-41.ec2.internal\" not found"
Apr 16 08:33:08.621220 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:08.621189 2579 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 08:33:08.692169 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:08.692097 2579 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-41.ec2.internal"
Apr 16 08:33:08.704258 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:08.704231 2579 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 08:33:08.705503 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:08.705483 2579 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-41.ec2.internal"
Apr 16 08:33:08.713158 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:08.713117 2579 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 08:33:08.865314 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:08.865283 2579 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 08:33:09.064689 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.064665 2579 apiserver.go:52] "Watching apiserver"
Apr 16 08:33:09.075510 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.075482 2579 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 16 08:33:09.076542 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.076516 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/kube-apiserver-proxy-ip-10-0-130-41.ec2.internal","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-41.ec2.internal","openshift-multus/multus-mkvns","openshift-network-diagnostics/network-check-target-ntc92","openshift-network-operator/iptables-alerter-fpxkh","openshift-ovn-kubernetes/ovnkube-node-m577d","kube-system/konnectivity-agent-j7rrw","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4mslc","openshift-cluster-node-tuning-operator/tuned-t6g8t","openshift-dns/node-resolver-6fwv8","openshift-image-registry/node-ca-hsx9c","openshift-multus/multus-additional-cni-plugins-6cdbx","openshift-multus/network-metrics-daemon-vdxz4"]
Apr 16 08:33:09.079142 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.079114 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-j7rrw"
Apr 16 08:33:09.080362 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.080342 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-mkvns"
Apr 16 08:33:09.081427 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.081408 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 16 08:33:09.081531 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.081436 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-q4ckz\""
Apr 16 08:33:09.081589 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.081567 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ntc92"
Apr 16 08:33:09.081664 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:33:09.081645 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ntc92" podUID="3b6d17c9-2d51-47bc-9e36-95fb034872cb"
Apr 16 08:33:09.081718 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.081670 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 16 08:33:09.082661 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.082639 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 16 08:33:09.082661 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.082658 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-qtgdf\""
Apr 16 08:33:09.082778 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.082686 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 16 08:33:09.082778 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.082774 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 16 08:33:09.085730 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.082961 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 16 08:33:09.085730 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.083022 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-fpxkh"
Apr 16 08:33:09.085730 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.083156 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-m577d"
Apr 16 08:33:09.085730 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.084763 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4mslc"
Apr 16 08:33:09.085730 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.085303 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 16 08:33:09.085730 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.085309 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 16 08:33:09.085730 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.085428 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-jk5tm\""
Apr 16 08:33:09.086158 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.086138 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 16 08:33:09.086267 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.086138 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 16 08:33:09.086494 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.086478 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-t6g8t"
Apr 16 08:33:09.088120 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.088099 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-6fwv8"
Apr 16 08:33:09.088515 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.088495 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 16 08:33:09.088656 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.088553 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 16 08:33:09.088656 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.088496 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 16 08:33:09.089465 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.088784 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 16 08:33:09.089465 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.088815 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-sccxv\""
Apr 16 08:33:09.089465 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.088988 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 16 08:33:09.089465 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.089007 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 16 08:33:09.089465 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.089059 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 16 08:33:09.089465 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.089060 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 16 08:33:09.089465 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.089238 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 16 08:33:09.089465 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.089247 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-d6lsm\""
Apr 16 08:33:09.089465 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.089359 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-j6zpl\""
Apr 16 08:33:09.089796 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.089597 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 16 08:33:09.090816 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.090385 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 16 08:33:09.090816 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.090477 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 16 08:33:09.090816 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.090548 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-hsx9c"
Apr 16 08:33:09.090816 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.090605 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-j754g\""
Apr 16 08:33:09.092586 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.092546 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 16 08:33:09.092709 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.092694 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 16 08:33:09.092814 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.092800 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 16 08:33:09.093045 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.093018 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-dffzl\""
Apr 16 08:33:09.094521 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.093641 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-6cdbx"
Apr 16 08:33:09.095796 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.095775 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vdxz4"
Apr 16 08:33:09.095883 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:33:09.095852 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vdxz4" podUID="53e35ca8-ec77-48b7-8e96-ae73f7083c85"
Apr 16 08:33:09.096193 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.096171 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 16 08:33:09.096656 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.096636 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 16 08:33:09.096843 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.096827 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-6bddg\""
Apr 16 08:33:09.104463 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.104443 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ed4414d4-f963-4c14-87d5-738798aeb287-cnibin\") pod \"multus-mkvns\" (UID: \"ed4414d4-f963-4c14-87d5-738798aeb287\") " pod="openshift-multus/multus-mkvns"
Apr 16 08:33:09.104543 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.104472 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/0508d934-e625-43c2-940e-b3bcd7809d6d-registration-dir\") pod \"aws-ebs-csi-driver-node-4mslc\" (UID: \"0508d934-e625-43c2-940e-b3bcd7809d6d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4mslc"
Apr 16 08:33:09.104543 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.104495 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/09a74c48-2de1-498a-893b-0fa1b8dbd0dd-host-kubelet\") pod \"ovnkube-node-m577d\" (UID: \"09a74c48-2de1-498a-893b-0fa1b8dbd0dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-m577d"
Apr 16 08:33:09.104543 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.104526 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/09a74c48-2de1-498a-893b-0fa1b8dbd0dd-node-log\") pod \"ovnkube-node-m577d\" (UID: \"09a74c48-2de1-498a-893b-0fa1b8dbd0dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-m577d"
Apr 16 08:33:09.104679 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.104566 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bc5726ae-d0b1-473a-9fd1-b1085d2b108e-var-lib-kubelet\") pod \"tuned-t6g8t\" (UID: \"bc5726ae-d0b1-473a-9fd1-b1085d2b108e\") " pod="openshift-cluster-node-tuning-operator/tuned-t6g8t"
Apr 16 08:33:09.104679 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.104602 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ed4414d4-f963-4c14-87d5-738798aeb287-etc-kubernetes\") pod \"multus-mkvns\" (UID: \"ed4414d4-f963-4c14-87d5-738798aeb287\") " pod="openshift-multus/multus-mkvns"
Apr 16 08:33:09.104679 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.104641 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2dd4e090-3d55-499c-a3fb-7a04e930e31c-host\") pod \"node-ca-hsx9c\" (UID: \"2dd4e090-3d55-499c-a3fb-7a04e930e31c\") " pod="openshift-image-registry/node-ca-hsx9c"
Apr 16 08:33:09.104679 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.104658 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/09a74c48-2de1-498a-893b-0fa1b8dbd0dd-ovnkube-script-lib\") pod \"ovnkube-node-m577d\" (UID: \"09a74c48-2de1-498a-893b-0fa1b8dbd0dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-m577d"
Apr 16 08:33:09.104818 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.104682 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/3ab93a09-2404-44a9-8381-13976f5a1595-agent-certs\") pod \"konnectivity-agent-j7rrw\" (UID: \"3ab93a09-2404-44a9-8381-13976f5a1595\") " pod="kube-system/konnectivity-agent-j7rrw"
Apr 16 08:33:09.104818 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.104706 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/09a74c48-2de1-498a-893b-0fa1b8dbd0dd-run-systemd\") pod \"ovnkube-node-m577d\" (UID: \"09a74c48-2de1-498a-893b-0fa1b8dbd0dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-m577d"
Apr 16 08:33:09.104818 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.104738 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/09a74c48-2de1-498a-893b-0fa1b8dbd0dd-var-lib-openvswitch\") pod \"ovnkube-node-m577d\" (UID: \"09a74c48-2de1-498a-893b-0fa1b8dbd0dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-m577d"
Apr 16 08:33:09.104818 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.104753 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ed4414d4-f963-4c14-87d5-738798aeb287-host-run-netns\") pod \"multus-mkvns\" (UID: \"ed4414d4-f963-4c14-87d5-738798aeb287\") " pod="openshift-multus/multus-mkvns"
Apr 16 08:33:09.105009 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.104878 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/ed4414d4-f963-4c14-87d5-738798aeb287-multus-daemon-config\") pod \"multus-mkvns\" (UID: \"ed4414d4-f963-4c14-87d5-738798aeb287\") " pod="openshift-multus/multus-mkvns"
Apr 16 08:33:09.105009 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.104928 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cdvsk\" (UniqueName: \"kubernetes.io/projected/bc5726ae-d0b1-473a-9fd1-b1085d2b108e-kube-api-access-cdvsk\") pod \"tuned-t6g8t\" (UID: \"bc5726ae-d0b1-473a-9fd1-b1085d2b108e\") " pod="openshift-cluster-node-tuning-operator/tuned-t6g8t"
Apr 16 08:33:09.105009 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.104962 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7886z\" (UniqueName: \"kubernetes.io/projected/3b6d17c9-2d51-47bc-9e36-95fb034872cb-kube-api-access-7886z\") pod \"network-check-target-ntc92\" (UID: \"3b6d17c9-2d51-47bc-9e36-95fb034872cb\") " pod="openshift-network-diagnostics/network-check-target-ntc92"
Apr 16 08:33:09.105009 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.104985 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkk6b\" (UniqueName: \"kubernetes.io/projected/da769e7e-5234-4962-b0d5-107292fb0b9f-kube-api-access-bkk6b\") pod \"iptables-alerter-fpxkh\" (UID: \"da769e7e-5234-4962-b0d5-107292fb0b9f\") " pod="openshift-network-operator/iptables-alerter-fpxkh"
Apr 16 08:33:09.105185 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.105008 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/09a74c48-2de1-498a-893b-0fa1b8dbd0dd-host-run-netns\") pod \"ovnkube-node-m577d\" (UID: \"09a74c48-2de1-498a-893b-0fa1b8dbd0dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-m577d"
Apr 16 08:33:09.105185 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.105033 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/09a74c48-2de1-498a-893b-0fa1b8dbd0dd-host-cni-netd\") pod \"ovnkube-node-m577d\" (UID: \"09a74c48-2de1-498a-893b-0fa1b8dbd0dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-m577d"
Apr 16 08:33:09.105185 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.105056 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/bc5726ae-d0b1-473a-9fd1-b1085d2b108e-etc-sysctl-conf\") pod \"tuned-t6g8t\" (UID: \"bc5726ae-d0b1-473a-9fd1-b1085d2b108e\") " pod="openshift-cluster-node-tuning-operator/tuned-t6g8t"
Apr 16 08:33:09.105185 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.105116 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/da769e7e-5234-4962-b0d5-107292fb0b9f-host-slash\") pod \"iptables-alerter-fpxkh\" (UID: \"da769e7e-5234-4962-b0d5-107292fb0b9f\") " pod="openshift-network-operator/iptables-alerter-fpxkh"
Apr 16 08:33:09.105185 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.105150 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/09a74c48-2de1-498a-893b-0fa1b8dbd0dd-env-overrides\") pod \"ovnkube-node-m577d\" (UID: \"09a74c48-2de1-498a-893b-0fa1b8dbd0dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-m577d"
Apr 16 08:33:09.105185 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.105174 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ed4414d4-f963-4c14-87d5-738798aeb287-multus-conf-dir\") pod \"multus-mkvns\" (UID: \"ed4414d4-f963-4c14-87d5-738798aeb287\") " pod="openshift-multus/multus-mkvns"
Apr 16 08:33:09.105485 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.105197 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6bzg6\" (UniqueName: \"kubernetes.io/projected/ed4414d4-f963-4c14-87d5-738798aeb287-kube-api-access-6bzg6\") pod \"multus-mkvns\" (UID: \"ed4414d4-f963-4c14-87d5-738798aeb287\") " pod="openshift-multus/multus-mkvns"
Apr 16 08:33:09.105485 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.105223 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0508d934-e625-43c2-940e-b3bcd7809d6d-kubelet-dir\") pod \"aws-ebs-csi-driver-node-4mslc\" (UID: \"0508d934-e625-43c2-940e-b3bcd7809d6d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4mslc"
Apr 16 08:33:09.105485 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.105308 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/0508d934-e625-43c2-940e-b3bcd7809d6d-socket-dir\") pod \"aws-ebs-csi-driver-node-4mslc\" (UID: \"0508d934-e625-43c2-940e-b3bcd7809d6d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4mslc"
Apr 16 08:33:09.105485 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.105331 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rx746\" (UniqueName: \"kubernetes.io/projected/9ab7948f-df7f-4fae-ad8e-a2cab21c427e-kube-api-access-rx746\") pod \"node-resolver-6fwv8\" (UID: \"9ab7948f-df7f-4fae-ad8e-a2cab21c427e\") " pod="openshift-dns/node-resolver-6fwv8"
Apr 16 08:33:09.105485 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.105357 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bdfrf\" (UniqueName: \"kubernetes.io/projected/2dd4e090-3d55-499c-a3fb-7a04e930e31c-kube-api-access-bdfrf\") pod \"node-ca-hsx9c\" (UID: \"2dd4e090-3d55-499c-a3fb-7a04e930e31c\") " pod="openshift-image-registry/node-ca-hsx9c"
Apr 16 08:33:09.105485 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.105380 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/da769e7e-5234-4962-b0d5-107292fb0b9f-iptables-alerter-script\") pod \"iptables-alerter-fpxkh\" (UID: \"da769e7e-5234-4962-b0d5-107292fb0b9f\") " pod="openshift-network-operator/iptables-alerter-fpxkh"
Apr 16 08:33:09.105485 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.105404 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/09a74c48-2de1-498a-893b-0fa1b8dbd0dd-log-socket\") pod \"ovnkube-node-m577d\" (UID: \"09a74c48-2de1-498a-893b-0fa1b8dbd0dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-m577d"
Apr 16 08:33:09.105485 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.105430 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/bc5726ae-d0b1-473a-9fd1-b1085d2b108e-etc-sysctl-d\") pod \"tuned-t6g8t\" (UID: \"bc5726ae-d0b1-473a-9fd1-b1085d2b108e\") " pod="openshift-cluster-node-tuning-operator/tuned-t6g8t"
Apr 16 08:33:09.105485 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.105453 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7330bfa4-019f-4fdd-bc61-5919a528f3e1-cni-binary-copy\") pod \"multus-additional-cni-plugins-6cdbx\" (UID: \"7330bfa4-019f-4fdd-bc61-5919a528f3e1\") " pod="openshift-multus/multus-additional-cni-plugins-6cdbx"
Apr 16 08:33:09.105485 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.105482 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7lm7\" (UniqueName: \"kubernetes.io/projected/7330bfa4-019f-4fdd-bc61-5919a528f3e1-kube-api-access-f7lm7\") pod \"multus-additional-cni-plugins-6cdbx\" (UID: \"7330bfa4-019f-4fdd-bc61-5919a528f3e1\") " pod="openshift-multus/multus-additional-cni-plugins-6cdbx"
Apr 16 08:33:09.105933 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.105505 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ed4414d4-f963-4c14-87d5-738798aeb287-host-var-lib-kubelet\") pod \"multus-mkvns\" (UID: \"ed4414d4-f963-4c14-87d5-738798aeb287\") " pod="openshift-multus/multus-mkvns"
Apr 16 08:33:09.105933 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.105528 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/2dd4e090-3d55-499c-a3fb-7a04e930e31c-serviceca\") pod \"node-ca-hsx9c\" (UID: \"2dd4e090-3d55-499c-a3fb-7a04e930e31c\") " pod="openshift-image-registry/node-ca-hsx9c"
Apr 16 08:33:09.105933 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.105584 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ed4414d4-f963-4c14-87d5-738798aeb287-system-cni-dir\") pod \"multus-mkvns\" (UID: \"ed4414d4-f963-4c14-87d5-738798aeb287\") " pod="openshift-multus/multus-mkvns"
Apr 16 08:33:09.105933 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.105625 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/9ab7948f-df7f-4fae-ad8e-a2cab21c427e-tmp-dir\") pod \"node-resolver-6fwv8\" (UID: \"9ab7948f-df7f-4fae-ad8e-a2cab21c427e\") " pod="openshift-dns/node-resolver-6fwv8"
Apr 16 08:33:09.105933 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.105659 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/3ab93a09-2404-44a9-8381-13976f5a1595-konnectivity-ca\") pod \"konnectivity-agent-j7rrw\" (UID: \"3ab93a09-2404-44a9-8381-13976f5a1595\") " pod="kube-system/konnectivity-agent-j7rrw"
Apr 16 08:33:09.105933 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.105683 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/bc5726ae-d0b1-473a-9fd1-b1085d2b108e-etc-sysconfig\") pod \"tuned-t6g8t\" (UID: \"bc5726ae-d0b1-473a-9fd1-b1085d2b108e\") " pod="openshift-cluster-node-tuning-operator/tuned-t6g8t"
Apr 16 08:33:09.105933 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.105707 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7330bfa4-019f-4fdd-bc61-5919a528f3e1-cnibin\") pod \"multus-additional-cni-plugins-6cdbx\" (UID: \"7330bfa4-019f-4fdd-bc61-5919a528f3e1\") " pod="openshift-multus/multus-additional-cni-plugins-6cdbx"
Apr 16 08:33:09.105933 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.105730 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ed4414d4-f963-4c14-87d5-738798aeb287-os-release\") pod \"multus-mkvns\" (UID: \"ed4414d4-f963-4c14-87d5-738798aeb287\") " pod="openshift-multus/multus-mkvns"
Apr 16 08:33:09.105933 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.105777 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ed4414d4-f963-4c14-87d5-738798aeb287-host-var-lib-cni-bin\") pod \"multus-mkvns\" (UID: \"ed4414d4-f963-4c14-87d5-738798aeb287\") " pod="openshift-multus/multus-mkvns"
Apr 16 08:33:09.105933 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.105803 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/ed4414d4-f963-4c14-87d5-738798aeb287-host-var-lib-cni-multus\") pod \"multus-mkvns\" (UID: \"ed4414d4-f963-4c14-87d5-738798aeb287\") " pod="openshift-multus/multus-mkvns"
Apr 16 08:33:09.105933 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.105850 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/0508d934-e625-43c2-940e-b3bcd7809d6d-sys-fs\") pod \"aws-ebs-csi-driver-node-4mslc\" (UID: \"0508d934-e625-43c2-940e-b3bcd7809d6d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4mslc"
Apr 16 08:33:09.105933 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.105872 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/09a74c48-2de1-498a-893b-0fa1b8dbd0dd-host-slash\") pod \"ovnkube-node-m577d\" (UID: \"09a74c48-2de1-498a-893b-0fa1b8dbd0dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-m577d"
Apr 16 08:33:09.105933 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.105910 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/bc5726ae-d0b1-473a-9fd1-b1085d2b108e-etc-modprobe-d\") pod \"tuned-t6g8t\" (UID: \"bc5726ae-d0b1-473a-9fd1-b1085d2b108e\") " pod="openshift-cluster-node-tuning-operator/tuned-t6g8t"
Apr 16 08:33:09.105933 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.105933 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7330bfa4-019f-4fdd-bc61-5919a528f3e1-system-cni-dir\") pod \"multus-additional-cni-plugins-6cdbx\" (UID: \"7330bfa4-019f-4fdd-bc61-5919a528f3e1\") " pod="openshift-multus/multus-additional-cni-plugins-6cdbx"
Apr 16 08:33:09.106464 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.105948 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7330bfa4-019f-4fdd-bc61-5919a528f3e1-os-release\") pod \"multus-additional-cni-plugins-6cdbx\" (UID: \"7330bfa4-019f-4fdd-bc61-5919a528f3e1\") " pod="openshift-multus/multus-additional-cni-plugins-6cdbx"
Apr 16 08:33:09.106464 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.105961 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/ed4414d4-f963-4c14-87d5-738798aeb287-multus-socket-dir-parent\") pod \"multus-mkvns\" (UID: \"ed4414d4-f963-4c14-87d5-738798aeb287\") " pod="openshift-multus/multus-mkvns"
Apr 16 08:33:09.106464 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.105975 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/bc5726ae-d0b1-473a-9fd1-b1085d2b108e-lib-modules\") pod \"tuned-t6g8t\" (UID: \"bc5726ae-d0b1-473a-9fd1-b1085d2b108e\") " pod="openshift-cluster-node-tuning-operator/tuned-t6g8t"
Apr 16 08:33:09.106464 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.106023 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ed4414d4-f963-4c14-87d5-738798aeb287-cni-binary-copy\") pod \"multus-mkvns\" (UID: \"ed4414d4-f963-4c14-87d5-738798aeb287\") "
pod="openshift-multus/multus-mkvns" Apr 16 08:33:09.106464 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.106051 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/09a74c48-2de1-498a-893b-0fa1b8dbd0dd-ovn-node-metrics-cert\") pod \"ovnkube-node-m577d\" (UID: \"09a74c48-2de1-498a-893b-0fa1b8dbd0dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-m577d" Apr 16 08:33:09.106464 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.106072 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/09a74c48-2de1-498a-893b-0fa1b8dbd0dd-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-m577d\" (UID: \"09a74c48-2de1-498a-893b-0fa1b8dbd0dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-m577d" Apr 16 08:33:09.106464 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.106096 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/ed4414d4-f963-4c14-87d5-738798aeb287-host-run-k8s-cni-cncf-io\") pod \"multus-mkvns\" (UID: \"ed4414d4-f963-4c14-87d5-738798aeb287\") " pod="openshift-multus/multus-mkvns" Apr 16 08:33:09.106464 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.106118 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/ed4414d4-f963-4c14-87d5-738798aeb287-hostroot\") pod \"multus-mkvns\" (UID: \"ed4414d4-f963-4c14-87d5-738798aeb287\") " pod="openshift-multus/multus-mkvns" Apr 16 08:33:09.106464 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.106136 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hfmwq\" (UniqueName: 
\"kubernetes.io/projected/0508d934-e625-43c2-940e-b3bcd7809d6d-kube-api-access-hfmwq\") pod \"aws-ebs-csi-driver-node-4mslc\" (UID: \"0508d934-e625-43c2-940e-b3bcd7809d6d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4mslc" Apr 16 08:33:09.106464 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.106158 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ed4414d4-f963-4c14-87d5-738798aeb287-multus-cni-dir\") pod \"multus-mkvns\" (UID: \"ed4414d4-f963-4c14-87d5-738798aeb287\") " pod="openshift-multus/multus-mkvns" Apr 16 08:33:09.106464 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.106178 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/09a74c48-2de1-498a-893b-0fa1b8dbd0dd-ovnkube-config\") pod \"ovnkube-node-m577d\" (UID: \"09a74c48-2de1-498a-893b-0fa1b8dbd0dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-m577d" Apr 16 08:33:09.106464 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.106197 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/09a74c48-2de1-498a-893b-0fa1b8dbd0dd-host-run-ovn-kubernetes\") pod \"ovnkube-node-m577d\" (UID: \"09a74c48-2de1-498a-893b-0fa1b8dbd0dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-m577d" Apr 16 08:33:09.106464 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.106218 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/bc5726ae-d0b1-473a-9fd1-b1085d2b108e-etc-systemd\") pod \"tuned-t6g8t\" (UID: \"bc5726ae-d0b1-473a-9fd1-b1085d2b108e\") " pod="openshift-cluster-node-tuning-operator/tuned-t6g8t" Apr 16 08:33:09.106464 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.106237 
2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/09a74c48-2de1-498a-893b-0fa1b8dbd0dd-host-cni-bin\") pod \"ovnkube-node-m577d\" (UID: \"09a74c48-2de1-498a-893b-0fa1b8dbd0dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-m577d" Apr 16 08:33:09.106464 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.106252 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prqtx\" (UniqueName: \"kubernetes.io/projected/09a74c48-2de1-498a-893b-0fa1b8dbd0dd-kube-api-access-prqtx\") pod \"ovnkube-node-m577d\" (UID: \"09a74c48-2de1-498a-893b-0fa1b8dbd0dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-m577d" Apr 16 08:33:09.106464 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.106268 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7330bfa4-019f-4fdd-bc61-5919a528f3e1-tuning-conf-dir\") pod \"multus-additional-cni-plugins-6cdbx\" (UID: \"7330bfa4-019f-4fdd-bc61-5919a528f3e1\") " pod="openshift-multus/multus-additional-cni-plugins-6cdbx" Apr 16 08:33:09.107173 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.106286 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/7330bfa4-019f-4fdd-bc61-5919a528f3e1-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-6cdbx\" (UID: \"7330bfa4-019f-4fdd-bc61-5919a528f3e1\") " pod="openshift-multus/multus-additional-cni-plugins-6cdbx" Apr 16 08:33:09.107173 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.106306 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/09a74c48-2de1-498a-893b-0fa1b8dbd0dd-etc-openvswitch\") pod 
\"ovnkube-node-m577d\" (UID: \"09a74c48-2de1-498a-893b-0fa1b8dbd0dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-m577d" Apr 16 08:33:09.107173 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.106329 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/09a74c48-2de1-498a-893b-0fa1b8dbd0dd-run-ovn\") pod \"ovnkube-node-m577d\" (UID: \"09a74c48-2de1-498a-893b-0fa1b8dbd0dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-m577d" Apr 16 08:33:09.107173 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.106350 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bc5726ae-d0b1-473a-9fd1-b1085d2b108e-etc-kubernetes\") pod \"tuned-t6g8t\" (UID: \"bc5726ae-d0b1-473a-9fd1-b1085d2b108e\") " pod="openshift-cluster-node-tuning-operator/tuned-t6g8t" Apr 16 08:33:09.107173 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.106384 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bc5726ae-d0b1-473a-9fd1-b1085d2b108e-sys\") pod \"tuned-t6g8t\" (UID: \"bc5726ae-d0b1-473a-9fd1-b1085d2b108e\") " pod="openshift-cluster-node-tuning-operator/tuned-t6g8t" Apr 16 08:33:09.107173 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.106413 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/bc5726ae-d0b1-473a-9fd1-b1085d2b108e-tmp\") pod \"tuned-t6g8t\" (UID: \"bc5726ae-d0b1-473a-9fd1-b1085d2b108e\") " pod="openshift-cluster-node-tuning-operator/tuned-t6g8t" Apr 16 08:33:09.107173 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.106436 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: 
\"kubernetes.io/host-path/0508d934-e625-43c2-940e-b3bcd7809d6d-device-dir\") pod \"aws-ebs-csi-driver-node-4mslc\" (UID: \"0508d934-e625-43c2-940e-b3bcd7809d6d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4mslc" Apr 16 08:33:09.107173 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.106460 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/09a74c48-2de1-498a-893b-0fa1b8dbd0dd-systemd-units\") pod \"ovnkube-node-m577d\" (UID: \"09a74c48-2de1-498a-893b-0fa1b8dbd0dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-m577d" Apr 16 08:33:09.107173 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.106743 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/09a74c48-2de1-498a-893b-0fa1b8dbd0dd-run-openvswitch\") pod \"ovnkube-node-m577d\" (UID: \"09a74c48-2de1-498a-893b-0fa1b8dbd0dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-m577d" Apr 16 08:33:09.107173 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.106768 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bc5726ae-d0b1-473a-9fd1-b1085d2b108e-host\") pod \"tuned-t6g8t\" (UID: \"bc5726ae-d0b1-473a-9fd1-b1085d2b108e\") " pod="openshift-cluster-node-tuning-operator/tuned-t6g8t" Apr 16 08:33:09.107173 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.106793 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/ed4414d4-f963-4c14-87d5-738798aeb287-host-run-multus-certs\") pod \"multus-mkvns\" (UID: \"ed4414d4-f963-4c14-87d5-738798aeb287\") " pod="openshift-multus/multus-mkvns" Apr 16 08:33:09.107173 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.106816 2579 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/0508d934-e625-43c2-940e-b3bcd7809d6d-etc-selinux\") pod \"aws-ebs-csi-driver-node-4mslc\" (UID: \"0508d934-e625-43c2-940e-b3bcd7809d6d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4mslc" Apr 16 08:33:09.107173 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.106839 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/9ab7948f-df7f-4fae-ad8e-a2cab21c427e-hosts-file\") pod \"node-resolver-6fwv8\" (UID: \"9ab7948f-df7f-4fae-ad8e-a2cab21c427e\") " pod="openshift-dns/node-resolver-6fwv8" Apr 16 08:33:09.107173 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.106862 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/bc5726ae-d0b1-473a-9fd1-b1085d2b108e-run\") pod \"tuned-t6g8t\" (UID: \"bc5726ae-d0b1-473a-9fd1-b1085d2b108e\") " pod="openshift-cluster-node-tuning-operator/tuned-t6g8t" Apr 16 08:33:09.107173 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.106904 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/bc5726ae-d0b1-473a-9fd1-b1085d2b108e-etc-tuned\") pod \"tuned-t6g8t\" (UID: \"bc5726ae-d0b1-473a-9fd1-b1085d2b108e\") " pod="openshift-cluster-node-tuning-operator/tuned-t6g8t" Apr 16 08:33:09.107173 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.106965 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7330bfa4-019f-4fdd-bc61-5919a528f3e1-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-6cdbx\" (UID: \"7330bfa4-019f-4fdd-bc61-5919a528f3e1\") " 
pod="openshift-multus/multus-additional-cni-plugins-6cdbx" Apr 16 08:33:09.141432 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.141407 2579 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 08:28:08 +0000 UTC" deadline="2028-01-10 02:46:50.07781201 +0000 UTC" Apr 16 08:33:09.141432 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.141432 2579 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15210h13m40.936383321s" Apr 16 08:33:09.193182 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.193161 2579 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 16 08:33:09.207473 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.207442 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-prqtx\" (UniqueName: \"kubernetes.io/projected/09a74c48-2de1-498a-893b-0fa1b8dbd0dd-kube-api-access-prqtx\") pod \"ovnkube-node-m577d\" (UID: \"09a74c48-2de1-498a-893b-0fa1b8dbd0dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-m577d" Apr 16 08:33:09.207473 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.207473 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7330bfa4-019f-4fdd-bc61-5919a528f3e1-tuning-conf-dir\") pod \"multus-additional-cni-plugins-6cdbx\" (UID: \"7330bfa4-019f-4fdd-bc61-5919a528f3e1\") " pod="openshift-multus/multus-additional-cni-plugins-6cdbx" Apr 16 08:33:09.207651 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.207491 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/7330bfa4-019f-4fdd-bc61-5919a528f3e1-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-6cdbx\" (UID: \"7330bfa4-019f-4fdd-bc61-5919a528f3e1\") " 
pod="openshift-multus/multus-additional-cni-plugins-6cdbx" Apr 16 08:33:09.207651 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.207514 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/09a74c48-2de1-498a-893b-0fa1b8dbd0dd-etc-openvswitch\") pod \"ovnkube-node-m577d\" (UID: \"09a74c48-2de1-498a-893b-0fa1b8dbd0dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-m577d" Apr 16 08:33:09.207651 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.207538 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/09a74c48-2de1-498a-893b-0fa1b8dbd0dd-run-ovn\") pod \"ovnkube-node-m577d\" (UID: \"09a74c48-2de1-498a-893b-0fa1b8dbd0dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-m577d" Apr 16 08:33:09.207651 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.207561 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bc5726ae-d0b1-473a-9fd1-b1085d2b108e-etc-kubernetes\") pod \"tuned-t6g8t\" (UID: \"bc5726ae-d0b1-473a-9fd1-b1085d2b108e\") " pod="openshift-cluster-node-tuning-operator/tuned-t6g8t" Apr 16 08:33:09.207651 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.207595 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/09a74c48-2de1-498a-893b-0fa1b8dbd0dd-etc-openvswitch\") pod \"ovnkube-node-m577d\" (UID: \"09a74c48-2de1-498a-893b-0fa1b8dbd0dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-m577d" Apr 16 08:33:09.207651 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.207604 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bc5726ae-d0b1-473a-9fd1-b1085d2b108e-sys\") pod \"tuned-t6g8t\" (UID: \"bc5726ae-d0b1-473a-9fd1-b1085d2b108e\") " 
pod="openshift-cluster-node-tuning-operator/tuned-t6g8t" Apr 16 08:33:09.207651 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.207614 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/09a74c48-2de1-498a-893b-0fa1b8dbd0dd-run-ovn\") pod \"ovnkube-node-m577d\" (UID: \"09a74c48-2de1-498a-893b-0fa1b8dbd0dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-m577d" Apr 16 08:33:09.207651 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.207626 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/bc5726ae-d0b1-473a-9fd1-b1085d2b108e-tmp\") pod \"tuned-t6g8t\" (UID: \"bc5726ae-d0b1-473a-9fd1-b1085d2b108e\") " pod="openshift-cluster-node-tuning-operator/tuned-t6g8t" Apr 16 08:33:09.207651 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.207650 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/0508d934-e625-43c2-940e-b3bcd7809d6d-device-dir\") pod \"aws-ebs-csi-driver-node-4mslc\" (UID: \"0508d934-e625-43c2-940e-b3bcd7809d6d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4mslc" Apr 16 08:33:09.208137 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.207663 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bc5726ae-d0b1-473a-9fd1-b1085d2b108e-etc-kubernetes\") pod \"tuned-t6g8t\" (UID: \"bc5726ae-d0b1-473a-9fd1-b1085d2b108e\") " pod="openshift-cluster-node-tuning-operator/tuned-t6g8t" Apr 16 08:33:09.208137 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.207676 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/09a74c48-2de1-498a-893b-0fa1b8dbd0dd-systemd-units\") pod \"ovnkube-node-m577d\" (UID: \"09a74c48-2de1-498a-893b-0fa1b8dbd0dd\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-m577d" Apr 16 08:33:09.208137 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.207681 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bc5726ae-d0b1-473a-9fd1-b1085d2b108e-sys\") pod \"tuned-t6g8t\" (UID: \"bc5726ae-d0b1-473a-9fd1-b1085d2b108e\") " pod="openshift-cluster-node-tuning-operator/tuned-t6g8t" Apr 16 08:33:09.208137 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.207699 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/09a74c48-2de1-498a-893b-0fa1b8dbd0dd-run-openvswitch\") pod \"ovnkube-node-m577d\" (UID: \"09a74c48-2de1-498a-893b-0fa1b8dbd0dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-m577d" Apr 16 08:33:09.208137 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.207723 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bc5726ae-d0b1-473a-9fd1-b1085d2b108e-host\") pod \"tuned-t6g8t\" (UID: \"bc5726ae-d0b1-473a-9fd1-b1085d2b108e\") " pod="openshift-cluster-node-tuning-operator/tuned-t6g8t" Apr 16 08:33:09.208137 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.207729 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/0508d934-e625-43c2-940e-b3bcd7809d6d-device-dir\") pod \"aws-ebs-csi-driver-node-4mslc\" (UID: \"0508d934-e625-43c2-940e-b3bcd7809d6d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4mslc" Apr 16 08:33:09.208137 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.207747 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/ed4414d4-f963-4c14-87d5-738798aeb287-host-run-multus-certs\") pod \"multus-mkvns\" (UID: \"ed4414d4-f963-4c14-87d5-738798aeb287\") " 
pod="openshift-multus/multus-mkvns" Apr 16 08:33:09.208137 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.207745 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/09a74c48-2de1-498a-893b-0fa1b8dbd0dd-systemd-units\") pod \"ovnkube-node-m577d\" (UID: \"09a74c48-2de1-498a-893b-0fa1b8dbd0dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-m577d" Apr 16 08:33:09.208137 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.207771 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/0508d934-e625-43c2-940e-b3bcd7809d6d-etc-selinux\") pod \"aws-ebs-csi-driver-node-4mslc\" (UID: \"0508d934-e625-43c2-940e-b3bcd7809d6d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4mslc" Apr 16 08:33:09.208137 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.207806 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/9ab7948f-df7f-4fae-ad8e-a2cab21c427e-hosts-file\") pod \"node-resolver-6fwv8\" (UID: \"9ab7948f-df7f-4fae-ad8e-a2cab21c427e\") " pod="openshift-dns/node-resolver-6fwv8" Apr 16 08:33:09.208137 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.207819 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/09a74c48-2de1-498a-893b-0fa1b8dbd0dd-run-openvswitch\") pod \"ovnkube-node-m577d\" (UID: \"09a74c48-2de1-498a-893b-0fa1b8dbd0dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-m577d" Apr 16 08:33:09.208137 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.207830 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/bc5726ae-d0b1-473a-9fd1-b1085d2b108e-run\") pod \"tuned-t6g8t\" (UID: \"bc5726ae-d0b1-473a-9fd1-b1085d2b108e\") " 
pod="openshift-cluster-node-tuning-operator/tuned-t6g8t" Apr 16 08:33:09.208137 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.207862 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/bc5726ae-d0b1-473a-9fd1-b1085d2b108e-etc-tuned\") pod \"tuned-t6g8t\" (UID: \"bc5726ae-d0b1-473a-9fd1-b1085d2b108e\") " pod="openshift-cluster-node-tuning-operator/tuned-t6g8t" Apr 16 08:33:09.208137 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.207905 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7330bfa4-019f-4fdd-bc61-5919a528f3e1-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-6cdbx\" (UID: \"7330bfa4-019f-4fdd-bc61-5919a528f3e1\") " pod="openshift-multus/multus-additional-cni-plugins-6cdbx" Apr 16 08:33:09.208137 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.207926 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/ed4414d4-f963-4c14-87d5-738798aeb287-host-run-multus-certs\") pod \"multus-mkvns\" (UID: \"ed4414d4-f963-4c14-87d5-738798aeb287\") " pod="openshift-multus/multus-mkvns" Apr 16 08:33:09.208137 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.207958 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/0508d934-e625-43c2-940e-b3bcd7809d6d-etc-selinux\") pod \"aws-ebs-csi-driver-node-4mslc\" (UID: \"0508d934-e625-43c2-940e-b3bcd7809d6d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4mslc" Apr 16 08:33:09.208137 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.207935 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ed4414d4-f963-4c14-87d5-738798aeb287-cnibin\") pod \"multus-mkvns\" (UID: 
\"ed4414d4-f963-4c14-87d5-738798aeb287\") " pod="openshift-multus/multus-mkvns" Apr 16 08:33:09.208137 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.208003 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ed4414d4-f963-4c14-87d5-738798aeb287-cnibin\") pod \"multus-mkvns\" (UID: \"ed4414d4-f963-4c14-87d5-738798aeb287\") " pod="openshift-multus/multus-mkvns" Apr 16 08:33:09.208774 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.208015 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/0508d934-e625-43c2-940e-b3bcd7809d6d-registration-dir\") pod \"aws-ebs-csi-driver-node-4mslc\" (UID: \"0508d934-e625-43c2-940e-b3bcd7809d6d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4mslc" Apr 16 08:33:09.208774 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.208050 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/09a74c48-2de1-498a-893b-0fa1b8dbd0dd-host-kubelet\") pod \"ovnkube-node-m577d\" (UID: \"09a74c48-2de1-498a-893b-0fa1b8dbd0dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-m577d" Apr 16 08:33:09.208774 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.208061 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bc5726ae-d0b1-473a-9fd1-b1085d2b108e-host\") pod \"tuned-t6g8t\" (UID: \"bc5726ae-d0b1-473a-9fd1-b1085d2b108e\") " pod="openshift-cluster-node-tuning-operator/tuned-t6g8t" Apr 16 08:33:09.208774 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.208001 2579 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 16 08:33:09.208774 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.208079 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/09a74c48-2de1-498a-893b-0fa1b8dbd0dd-node-log\") pod \"ovnkube-node-m577d\" (UID: \"09a74c48-2de1-498a-893b-0fa1b8dbd0dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-m577d"
Apr 16 08:33:09.208774 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.208106 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bc5726ae-d0b1-473a-9fd1-b1085d2b108e-var-lib-kubelet\") pod \"tuned-t6g8t\" (UID: \"bc5726ae-d0b1-473a-9fd1-b1085d2b108e\") " pod="openshift-cluster-node-tuning-operator/tuned-t6g8t"
Apr 16 08:33:09.208774 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.208115 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/0508d934-e625-43c2-940e-b3bcd7809d6d-registration-dir\") pod \"aws-ebs-csi-driver-node-4mslc\" (UID: \"0508d934-e625-43c2-940e-b3bcd7809d6d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4mslc"
Apr 16 08:33:09.208774 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.208132 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ed4414d4-f963-4c14-87d5-738798aeb287-etc-kubernetes\") pod \"multus-mkvns\" (UID: \"ed4414d4-f963-4c14-87d5-738798aeb287\") " pod="openshift-multus/multus-mkvns"
Apr 16 08:33:09.208774 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.208160 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2dd4e090-3d55-499c-a3fb-7a04e930e31c-host\") pod \"node-ca-hsx9c\" (UID: \"2dd4e090-3d55-499c-a3fb-7a04e930e31c\") " pod="openshift-image-registry/node-ca-hsx9c"
Apr 16 08:33:09.208774 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.208165 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/09a74c48-2de1-498a-893b-0fa1b8dbd0dd-node-log\") pod \"ovnkube-node-m577d\" (UID: \"09a74c48-2de1-498a-893b-0fa1b8dbd0dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-m577d"
Apr 16 08:33:09.208774 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.208189 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/53e35ca8-ec77-48b7-8e96-ae73f7083c85-metrics-certs\") pod \"network-metrics-daemon-vdxz4\" (UID: \"53e35ca8-ec77-48b7-8e96-ae73f7083c85\") " pod="openshift-multus/network-metrics-daemon-vdxz4"
Apr 16 08:33:09.208774 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.208209 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/7330bfa4-019f-4fdd-bc61-5919a528f3e1-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-6cdbx\" (UID: \"7330bfa4-019f-4fdd-bc61-5919a528f3e1\") " pod="openshift-multus/multus-additional-cni-plugins-6cdbx"
Apr 16 08:33:09.208774 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.208223 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/09a74c48-2de1-498a-893b-0fa1b8dbd0dd-ovnkube-script-lib\") pod \"ovnkube-node-m577d\" (UID: \"09a74c48-2de1-498a-893b-0fa1b8dbd0dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-m577d"
Apr 16 08:33:09.208774 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.208202 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/09a74c48-2de1-498a-893b-0fa1b8dbd0dd-host-kubelet\") pod \"ovnkube-node-m577d\" (UID: \"09a74c48-2de1-498a-893b-0fa1b8dbd0dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-m577d"
Apr 16 08:33:09.208774 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.208252 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/3ab93a09-2404-44a9-8381-13976f5a1595-agent-certs\") pod \"konnectivity-agent-j7rrw\" (UID: \"3ab93a09-2404-44a9-8381-13976f5a1595\") " pod="kube-system/konnectivity-agent-j7rrw"
Apr 16 08:33:09.208774 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.208248 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ed4414d4-f963-4c14-87d5-738798aeb287-etc-kubernetes\") pod \"multus-mkvns\" (UID: \"ed4414d4-f963-4c14-87d5-738798aeb287\") " pod="openshift-multus/multus-mkvns"
Apr 16 08:33:09.208774 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.208289 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/09a74c48-2de1-498a-893b-0fa1b8dbd0dd-run-systemd\") pod \"ovnkube-node-m577d\" (UID: \"09a74c48-2de1-498a-893b-0fa1b8dbd0dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-m577d"
Apr 16 08:33:09.208774 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.208292 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bc5726ae-d0b1-473a-9fd1-b1085d2b108e-var-lib-kubelet\") pod \"tuned-t6g8t\" (UID: \"bc5726ae-d0b1-473a-9fd1-b1085d2b108e\") " pod="openshift-cluster-node-tuning-operator/tuned-t6g8t"
Apr 16 08:33:09.209734 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.208316 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/09a74c48-2de1-498a-893b-0fa1b8dbd0dd-var-lib-openvswitch\") pod \"ovnkube-node-m577d\" (UID: \"09a74c48-2de1-498a-893b-0fa1b8dbd0dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-m577d"
Apr 16 08:33:09.209734 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.208340 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ed4414d4-f963-4c14-87d5-738798aeb287-host-run-netns\") pod \"multus-mkvns\" (UID: \"ed4414d4-f963-4c14-87d5-738798aeb287\") " pod="openshift-multus/multus-mkvns"
Apr 16 08:33:09.209734 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.208341 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/bc5726ae-d0b1-473a-9fd1-b1085d2b108e-run\") pod \"tuned-t6g8t\" (UID: \"bc5726ae-d0b1-473a-9fd1-b1085d2b108e\") " pod="openshift-cluster-node-tuning-operator/tuned-t6g8t"
Apr 16 08:33:09.209734 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.208390 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ed4414d4-f963-4c14-87d5-738798aeb287-host-run-netns\") pod \"multus-mkvns\" (UID: \"ed4414d4-f963-4c14-87d5-738798aeb287\") " pod="openshift-multus/multus-mkvns"
Apr 16 08:33:09.209734 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.208375 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/09a74c48-2de1-498a-893b-0fa1b8dbd0dd-var-lib-openvswitch\") pod \"ovnkube-node-m577d\" (UID: \"09a74c48-2de1-498a-893b-0fa1b8dbd0dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-m577d"
Apr 16 08:33:09.209734 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.208440 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/09a74c48-2de1-498a-893b-0fa1b8dbd0dd-run-systemd\") pod \"ovnkube-node-m577d\" (UID: \"09a74c48-2de1-498a-893b-0fa1b8dbd0dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-m577d"
Apr 16 08:33:09.209734 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.208443 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2dd4e090-3d55-499c-a3fb-7a04e930e31c-host\") pod \"node-ca-hsx9c\" (UID: \"2dd4e090-3d55-499c-a3fb-7a04e930e31c\") " pod="openshift-image-registry/node-ca-hsx9c"
Apr 16 08:33:09.209734 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.208456 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7330bfa4-019f-4fdd-bc61-5919a528f3e1-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-6cdbx\" (UID: \"7330bfa4-019f-4fdd-bc61-5919a528f3e1\") " pod="openshift-multus/multus-additional-cni-plugins-6cdbx"
Apr 16 08:33:09.209734 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.208476 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/ed4414d4-f963-4c14-87d5-738798aeb287-multus-daemon-config\") pod \"multus-mkvns\" (UID: \"ed4414d4-f963-4c14-87d5-738798aeb287\") " pod="openshift-multus/multus-mkvns"
Apr 16 08:33:09.209734 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.208502 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/9ab7948f-df7f-4fae-ad8e-a2cab21c427e-hosts-file\") pod \"node-resolver-6fwv8\" (UID: \"9ab7948f-df7f-4fae-ad8e-a2cab21c427e\") " pod="openshift-dns/node-resolver-6fwv8"
Apr 16 08:33:09.209734 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.208509 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cdvsk\" (UniqueName: \"kubernetes.io/projected/bc5726ae-d0b1-473a-9fd1-b1085d2b108e-kube-api-access-cdvsk\") pod \"tuned-t6g8t\" (UID: \"bc5726ae-d0b1-473a-9fd1-b1085d2b108e\") " pod="openshift-cluster-node-tuning-operator/tuned-t6g8t"
Apr 16 08:33:09.209734 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.208552 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7886z\" (UniqueName: \"kubernetes.io/projected/3b6d17c9-2d51-47bc-9e36-95fb034872cb-kube-api-access-7886z\") pod \"network-check-target-ntc92\" (UID: \"3b6d17c9-2d51-47bc-9e36-95fb034872cb\") " pod="openshift-network-diagnostics/network-check-target-ntc92"
Apr 16 08:33:09.209734 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.208584 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bkk6b\" (UniqueName: \"kubernetes.io/projected/da769e7e-5234-4962-b0d5-107292fb0b9f-kube-api-access-bkk6b\") pod \"iptables-alerter-fpxkh\" (UID: \"da769e7e-5234-4962-b0d5-107292fb0b9f\") " pod="openshift-network-operator/iptables-alerter-fpxkh"
Apr 16 08:33:09.209734 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.209480 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7330bfa4-019f-4fdd-bc61-5919a528f3e1-tuning-conf-dir\") pod \"multus-additional-cni-plugins-6cdbx\" (UID: \"7330bfa4-019f-4fdd-bc61-5919a528f3e1\") " pod="openshift-multus/multus-additional-cni-plugins-6cdbx"
Apr 16 08:33:09.209734 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.208630 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/09a74c48-2de1-498a-893b-0fa1b8dbd0dd-host-run-netns\") pod \"ovnkube-node-m577d\" (UID: \"09a74c48-2de1-498a-893b-0fa1b8dbd0dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-m577d"
Apr 16 08:33:09.209734 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.209639 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/09a74c48-2de1-498a-893b-0fa1b8dbd0dd-host-cni-netd\") pod \"ovnkube-node-m577d\" (UID: \"09a74c48-2de1-498a-893b-0fa1b8dbd0dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-m577d"
Apr 16 08:33:09.209734 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.209678 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/09a74c48-2de1-498a-893b-0fa1b8dbd0dd-ovnkube-script-lib\") pod \"ovnkube-node-m577d\" (UID: \"09a74c48-2de1-498a-893b-0fa1b8dbd0dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-m577d"
Apr 16 08:33:09.210502 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.209692 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/bc5726ae-d0b1-473a-9fd1-b1085d2b108e-etc-sysctl-conf\") pod \"tuned-t6g8t\" (UID: \"bc5726ae-d0b1-473a-9fd1-b1085d2b108e\") " pod="openshift-cluster-node-tuning-operator/tuned-t6g8t"
Apr 16 08:33:09.210502 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.209727 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/da769e7e-5234-4962-b0d5-107292fb0b9f-host-slash\") pod \"iptables-alerter-fpxkh\" (UID: \"da769e7e-5234-4962-b0d5-107292fb0b9f\") " pod="openshift-network-operator/iptables-alerter-fpxkh"
Apr 16 08:33:09.210502 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.209764 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/09a74c48-2de1-498a-893b-0fa1b8dbd0dd-host-cni-netd\") pod \"ovnkube-node-m577d\" (UID: \"09a74c48-2de1-498a-893b-0fa1b8dbd0dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-m577d"
Apr 16 08:33:09.210502 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.209775 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/09a74c48-2de1-498a-893b-0fa1b8dbd0dd-env-overrides\") pod \"ovnkube-node-m577d\" (UID: \"09a74c48-2de1-498a-893b-0fa1b8dbd0dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-m577d"
Apr 16 08:33:09.210502 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.209844 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/bc5726ae-d0b1-473a-9fd1-b1085d2b108e-etc-sysctl-conf\") pod \"tuned-t6g8t\" (UID: \"bc5726ae-d0b1-473a-9fd1-b1085d2b108e\") " pod="openshift-cluster-node-tuning-operator/tuned-t6g8t"
Apr 16 08:33:09.210502 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.209850 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/da769e7e-5234-4962-b0d5-107292fb0b9f-host-slash\") pod \"iptables-alerter-fpxkh\" (UID: \"da769e7e-5234-4962-b0d5-107292fb0b9f\") " pod="openshift-network-operator/iptables-alerter-fpxkh"
Apr 16 08:33:09.210502 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.209920 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ed4414d4-f963-4c14-87d5-738798aeb287-multus-conf-dir\") pod \"multus-mkvns\" (UID: \"ed4414d4-f963-4c14-87d5-738798aeb287\") " pod="openshift-multus/multus-mkvns"
Apr 16 08:33:09.210502 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.209964 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6bzg6\" (UniqueName: \"kubernetes.io/projected/ed4414d4-f963-4c14-87d5-738798aeb287-kube-api-access-6bzg6\") pod \"multus-mkvns\" (UID: \"ed4414d4-f963-4c14-87d5-738798aeb287\") " pod="openshift-multus/multus-mkvns"
Apr 16 08:33:09.210502 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.210019 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/09a74c48-2de1-498a-893b-0fa1b8dbd0dd-host-run-netns\") pod \"ovnkube-node-m577d\" (UID: \"09a74c48-2de1-498a-893b-0fa1b8dbd0dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-m577d"
Apr 16 08:33:09.210502 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.210057 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ed4414d4-f963-4c14-87d5-738798aeb287-multus-conf-dir\") pod \"multus-mkvns\" (UID: \"ed4414d4-f963-4c14-87d5-738798aeb287\") " pod="openshift-multus/multus-mkvns"
Apr 16 08:33:09.210502 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.210091 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0508d934-e625-43c2-940e-b3bcd7809d6d-kubelet-dir\") pod \"aws-ebs-csi-driver-node-4mslc\" (UID: \"0508d934-e625-43c2-940e-b3bcd7809d6d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4mslc"
Apr 16 08:33:09.210502 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.210160 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0508d934-e625-43c2-940e-b3bcd7809d6d-kubelet-dir\") pod \"aws-ebs-csi-driver-node-4mslc\" (UID: \"0508d934-e625-43c2-940e-b3bcd7809d6d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4mslc"
Apr 16 08:33:09.210502 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.210230 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/0508d934-e625-43c2-940e-b3bcd7809d6d-socket-dir\") pod \"aws-ebs-csi-driver-node-4mslc\" (UID: \"0508d934-e625-43c2-940e-b3bcd7809d6d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4mslc"
Apr 16 08:33:09.210502 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.210275 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rx746\" (UniqueName: \"kubernetes.io/projected/9ab7948f-df7f-4fae-ad8e-a2cab21c427e-kube-api-access-rx746\") pod \"node-resolver-6fwv8\" (UID: \"9ab7948f-df7f-4fae-ad8e-a2cab21c427e\") " pod="openshift-dns/node-resolver-6fwv8"
Apr 16 08:33:09.210502 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.210310 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bdfrf\" (UniqueName: \"kubernetes.io/projected/2dd4e090-3d55-499c-a3fb-7a04e930e31c-kube-api-access-bdfrf\") pod \"node-ca-hsx9c\" (UID: \"2dd4e090-3d55-499c-a3fb-7a04e930e31c\") " pod="openshift-image-registry/node-ca-hsx9c"
Apr 16 08:33:09.210502 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.210336 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/da769e7e-5234-4962-b0d5-107292fb0b9f-iptables-alerter-script\") pod \"iptables-alerter-fpxkh\" (UID: \"da769e7e-5234-4962-b0d5-107292fb0b9f\") " pod="openshift-network-operator/iptables-alerter-fpxkh"
Apr 16 08:33:09.210502 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.210365 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/09a74c48-2de1-498a-893b-0fa1b8dbd0dd-log-socket\") pod \"ovnkube-node-m577d\" (UID: \"09a74c48-2de1-498a-893b-0fa1b8dbd0dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-m577d"
Apr 16 08:33:09.211261 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.210373 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/09a74c48-2de1-498a-893b-0fa1b8dbd0dd-env-overrides\") pod \"ovnkube-node-m577d\" (UID: \"09a74c48-2de1-498a-893b-0fa1b8dbd0dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-m577d"
Apr 16 08:33:09.211261 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.210418 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/bc5726ae-d0b1-473a-9fd1-b1085d2b108e-etc-sysctl-d\") pod \"tuned-t6g8t\" (UID: \"bc5726ae-d0b1-473a-9fd1-b1085d2b108e\") " pod="openshift-cluster-node-tuning-operator/tuned-t6g8t"
Apr 16 08:33:09.211261 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.210487 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7330bfa4-019f-4fdd-bc61-5919a528f3e1-cni-binary-copy\") pod \"multus-additional-cni-plugins-6cdbx\" (UID: \"7330bfa4-019f-4fdd-bc61-5919a528f3e1\") " pod="openshift-multus/multus-additional-cni-plugins-6cdbx"
Apr 16 08:33:09.211261 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.210522 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/bc5726ae-d0b1-473a-9fd1-b1085d2b108e-etc-sysctl-d\") pod \"tuned-t6g8t\" (UID: \"bc5726ae-d0b1-473a-9fd1-b1085d2b108e\") " pod="openshift-cluster-node-tuning-operator/tuned-t6g8t"
Apr 16 08:33:09.211261 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.210530 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f7lm7\" (UniqueName: \"kubernetes.io/projected/7330bfa4-019f-4fdd-bc61-5919a528f3e1-kube-api-access-f7lm7\") pod \"multus-additional-cni-plugins-6cdbx\" (UID: \"7330bfa4-019f-4fdd-bc61-5919a528f3e1\") " pod="openshift-multus/multus-additional-cni-plugins-6cdbx"
Apr 16 08:33:09.211261 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.210588 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ed4414d4-f963-4c14-87d5-738798aeb287-host-var-lib-kubelet\") pod \"multus-mkvns\" (UID: \"ed4414d4-f963-4c14-87d5-738798aeb287\") " pod="openshift-multus/multus-mkvns"
Apr 16 08:33:09.211261 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.210657 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/2dd4e090-3d55-499c-a3fb-7a04e930e31c-serviceca\") pod \"node-ca-hsx9c\" (UID: \"2dd4e090-3d55-499c-a3fb-7a04e930e31c\") " pod="openshift-image-registry/node-ca-hsx9c"
Apr 16 08:33:09.211261 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.210700 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ed4414d4-f963-4c14-87d5-738798aeb287-system-cni-dir\") pod \"multus-mkvns\" (UID: \"ed4414d4-f963-4c14-87d5-738798aeb287\") " pod="openshift-multus/multus-mkvns"
Apr 16 08:33:09.211261 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.210726 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/9ab7948f-df7f-4fae-ad8e-a2cab21c427e-tmp-dir\") pod \"node-resolver-6fwv8\" (UID: \"9ab7948f-df7f-4fae-ad8e-a2cab21c427e\") " pod="openshift-dns/node-resolver-6fwv8"
Apr 16 08:33:09.211261 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.210772 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/3ab93a09-2404-44a9-8381-13976f5a1595-konnectivity-ca\") pod \"konnectivity-agent-j7rrw\" (UID: \"3ab93a09-2404-44a9-8381-13976f5a1595\") " pod="kube-system/konnectivity-agent-j7rrw"
Apr 16 08:33:09.211261 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.210827 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/bc5726ae-d0b1-473a-9fd1-b1085d2b108e-etc-sysconfig\") pod \"tuned-t6g8t\" (UID: \"bc5726ae-d0b1-473a-9fd1-b1085d2b108e\") " pod="openshift-cluster-node-tuning-operator/tuned-t6g8t"
Apr 16 08:33:09.211261 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.210962 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/0508d934-e625-43c2-940e-b3bcd7809d6d-socket-dir\") pod \"aws-ebs-csi-driver-node-4mslc\" (UID: \"0508d934-e625-43c2-940e-b3bcd7809d6d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4mslc"
Apr 16 08:33:09.211261 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.210999 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7330bfa4-019f-4fdd-bc61-5919a528f3e1-cnibin\") pod \"multus-additional-cni-plugins-6cdbx\" (UID: \"7330bfa4-019f-4fdd-bc61-5919a528f3e1\") " pod="openshift-multus/multus-additional-cni-plugins-6cdbx"
Apr 16 08:33:09.211261 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.211025 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ed4414d4-f963-4c14-87d5-738798aeb287-os-release\") pod \"multus-mkvns\" (UID: \"ed4414d4-f963-4c14-87d5-738798aeb287\") " pod="openshift-multus/multus-mkvns"
Apr 16 08:33:09.211261 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.211075 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ed4414d4-f963-4c14-87d5-738798aeb287-host-var-lib-cni-bin\") pod \"multus-mkvns\" (UID: \"ed4414d4-f963-4c14-87d5-738798aeb287\") " pod="openshift-multus/multus-mkvns"
Apr 16 08:33:09.211821 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.211209 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/ed4414d4-f963-4c14-87d5-738798aeb287-host-var-lib-cni-multus\") pod \"multus-mkvns\" (UID: \"ed4414d4-f963-4c14-87d5-738798aeb287\") " pod="openshift-multus/multus-mkvns"
Apr 16 08:33:09.211821 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.211321 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/0508d934-e625-43c2-940e-b3bcd7809d6d-sys-fs\") pod \"aws-ebs-csi-driver-node-4mslc\" (UID: \"0508d934-e625-43c2-940e-b3bcd7809d6d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4mslc"
Apr 16 08:33:09.211821 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.211389 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/09a74c48-2de1-498a-893b-0fa1b8dbd0dd-host-slash\") pod \"ovnkube-node-m577d\" (UID: \"09a74c48-2de1-498a-893b-0fa1b8dbd0dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-m577d"
Apr 16 08:33:09.211821 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.211415 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/bc5726ae-d0b1-473a-9fd1-b1085d2b108e-etc-modprobe-d\") pod \"tuned-t6g8t\" (UID: \"bc5726ae-d0b1-473a-9fd1-b1085d2b108e\") " pod="openshift-cluster-node-tuning-operator/tuned-t6g8t"
Apr 16 08:33:09.211821 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.211441 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7330bfa4-019f-4fdd-bc61-5919a528f3e1-system-cni-dir\") pod \"multus-additional-cni-plugins-6cdbx\" (UID: \"7330bfa4-019f-4fdd-bc61-5919a528f3e1\") " pod="openshift-multus/multus-additional-cni-plugins-6cdbx"
Apr 16 08:33:09.211821 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.211613 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7330bfa4-019f-4fdd-bc61-5919a528f3e1-os-release\") pod \"multus-additional-cni-plugins-6cdbx\" (UID: \"7330bfa4-019f-4fdd-bc61-5919a528f3e1\") " pod="openshift-multus/multus-additional-cni-plugins-6cdbx"
Apr 16 08:33:09.211821 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.211634 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/ed4414d4-f963-4c14-87d5-738798aeb287-multus-socket-dir-parent\") pod \"multus-mkvns\" (UID: \"ed4414d4-f963-4c14-87d5-738798aeb287\") " pod="openshift-multus/multus-mkvns"
Apr 16 08:33:09.211821 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.211658 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/bc5726ae-d0b1-473a-9fd1-b1085d2b108e-lib-modules\") pod \"tuned-t6g8t\" (UID: \"bc5726ae-d0b1-473a-9fd1-b1085d2b108e\") " pod="openshift-cluster-node-tuning-operator/tuned-t6g8t"
Apr 16 08:33:09.211821 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.211694 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/9ab7948f-df7f-4fae-ad8e-a2cab21c427e-tmp-dir\") pod \"node-resolver-6fwv8\" (UID: \"9ab7948f-df7f-4fae-ad8e-a2cab21c427e\") " pod="openshift-dns/node-resolver-6fwv8"
Apr 16 08:33:09.212122 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.211710 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ed4414d4-f963-4c14-87d5-738798aeb287-cni-binary-copy\") pod \"multus-mkvns\" (UID: \"ed4414d4-f963-4c14-87d5-738798aeb287\") " pod="openshift-multus/multus-mkvns"
Apr 16 08:33:09.212122 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.211934 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/09a74c48-2de1-498a-893b-0fa1b8dbd0dd-ovn-node-metrics-cert\") pod \"ovnkube-node-m577d\" (UID: \"09a74c48-2de1-498a-893b-0fa1b8dbd0dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-m577d"
Apr 16 08:33:09.212122 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.211968 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/09a74c48-2de1-498a-893b-0fa1b8dbd0dd-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-m577d\" (UID: \"09a74c48-2de1-498a-893b-0fa1b8dbd0dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-m577d"
Apr 16 08:33:09.212122 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.211999 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7330bfa4-019f-4fdd-bc61-5919a528f3e1-cni-binary-copy\") pod \"multus-additional-cni-plugins-6cdbx\" (UID: \"7330bfa4-019f-4fdd-bc61-5919a528f3e1\") " pod="openshift-multus/multus-additional-cni-plugins-6cdbx"
Apr 16 08:33:09.212122 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.212029 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/bc5726ae-d0b1-473a-9fd1-b1085d2b108e-tmp\") pod \"tuned-t6g8t\" (UID: \"bc5726ae-d0b1-473a-9fd1-b1085d2b108e\") " pod="openshift-cluster-node-tuning-operator/tuned-t6g8t"
Apr 16 08:33:09.212122 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.212051 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/ed4414d4-f963-4c14-87d5-738798aeb287-host-run-k8s-cni-cncf-io\") pod \"multus-mkvns\" (UID: \"ed4414d4-f963-4c14-87d5-738798aeb287\") " pod="openshift-multus/multus-mkvns"
Apr 16 08:33:09.212122 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.212103 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/ed4414d4-f963-4c14-87d5-738798aeb287-hostroot\") pod \"multus-mkvns\" (UID: \"ed4414d4-f963-4c14-87d5-738798aeb287\") " pod="openshift-multus/multus-mkvns"
Apr 16 08:33:09.212373 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.212147 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hfmwq\" (UniqueName: \"kubernetes.io/projected/0508d934-e625-43c2-940e-b3bcd7809d6d-kube-api-access-hfmwq\") pod \"aws-ebs-csi-driver-node-4mslc\" (UID: \"0508d934-e625-43c2-940e-b3bcd7809d6d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4mslc"
Apr 16 08:33:09.212373 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.212138 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ed4414d4-f963-4c14-87d5-738798aeb287-host-var-lib-kubelet\") pod \"multus-mkvns\" (UID: \"ed4414d4-f963-4c14-87d5-738798aeb287\") " pod="openshift-multus/multus-mkvns"
Apr 16 08:33:09.212373 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.212189 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ed4414d4-f963-4c14-87d5-738798aeb287-multus-cni-dir\") pod \"multus-mkvns\" (UID: \"ed4414d4-f963-4c14-87d5-738798aeb287\") " pod="openshift-multus/multus-mkvns"
Apr 16 08:33:09.212373 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.212239 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/09a74c48-2de1-498a-893b-0fa1b8dbd0dd-log-socket\") pod \"ovnkube-node-m577d\" (UID: \"09a74c48-2de1-498a-893b-0fa1b8dbd0dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-m577d"
Apr 16 08:33:09.212373 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.212249 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/ed4414d4-f963-4c14-87d5-738798aeb287-host-run-k8s-cni-cncf-io\") pod \"multus-mkvns\" (UID: \"ed4414d4-f963-4c14-87d5-738798aeb287\") " pod="openshift-multus/multus-mkvns"
Apr 16 08:33:09.212373 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.212267 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ed4414d4-f963-4c14-87d5-738798aeb287-multus-cni-dir\") pod \"multus-mkvns\" (UID: \"ed4414d4-f963-4c14-87d5-738798aeb287\") " pod="openshift-multus/multus-mkvns"
Apr 16 08:33:09.212373 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.212300 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/09a74c48-2de1-498a-893b-0fa1b8dbd0dd-host-slash\") pod \"ovnkube-node-m577d\" (UID: \"09a74c48-2de1-498a-893b-0fa1b8dbd0dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-m577d"
Apr 16 08:33:09.212589 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.212464 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/da769e7e-5234-4962-b0d5-107292fb0b9f-iptables-alerter-script\") pod \"iptables-alerter-fpxkh\" (UID: \"da769e7e-5234-4962-b0d5-107292fb0b9f\") " pod="openshift-network-operator/iptables-alerter-fpxkh"
Apr 16 08:33:09.212589 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.212474 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7330bfa4-019f-4fdd-bc61-5919a528f3e1-os-release\") pod \"multus-additional-cni-plugins-6cdbx\" (UID: \"7330bfa4-019f-4fdd-bc61-5919a528f3e1\") " pod="openshift-multus/multus-additional-cni-plugins-6cdbx"
Apr 16 08:33:09.212589 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.212582 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ed4414d4-f963-4c14-87d5-738798aeb287-system-cni-dir\") pod \"multus-mkvns\" (UID: \"ed4414d4-f963-4c14-87d5-738798aeb287\") " pod="openshift-multus/multus-mkvns"
Apr 16 08:33:09.212682 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.212642 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/09a74c48-2de1-498a-893b-0fa1b8dbd0dd-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-m577d\" (UID: \"09a74c48-2de1-498a-893b-0fa1b8dbd0dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-m577d"
Apr 16 08:33:09.212742 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.212721 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/ed4414d4-f963-4c14-87d5-738798aeb287-multus-socket-dir-parent\") pod \"multus-mkvns\" (UID: \"ed4414d4-f963-4c14-87d5-738798aeb287\") " pod="openshift-multus/multus-mkvns"
Apr 16 08:33:09.212933 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.212912 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/3ab93a09-2404-44a9-8381-13976f5a1595-konnectivity-ca\") pod \"konnectivity-agent-j7rrw\" (UID: \"3ab93a09-2404-44a9-8381-13976f5a1595\") " pod="kube-system/konnectivity-agent-j7rrw"
Apr 16 08:33:09.213077 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.213037 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/ed4414d4-f963-4c14-87d5-738798aeb287-hostroot\") pod \"multus-mkvns\" (UID: \"ed4414d4-f963-4c14-87d5-738798aeb287\") " pod="openshift-multus/multus-mkvns"
Apr 16 08:33:09.213077 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.213063 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/2dd4e090-3d55-499c-a3fb-7a04e930e31c-serviceca\") pod \"node-ca-hsx9c\" (UID: \"2dd4e090-3d55-499c-a3fb-7a04e930e31c\") " pod="openshift-image-registry/node-ca-hsx9c"
Apr 16 08:33:09.213216 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.213171 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/bc5726ae-d0b1-473a-9fd1-b1085d2b108e-etc-modprobe-d\") pod \"tuned-t6g8t\" (UID: \"bc5726ae-d0b1-473a-9fd1-b1085d2b108e\") " pod="openshift-cluster-node-tuning-operator/tuned-t6g8t"
Apr 16 08:33:09.213329 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.213313 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/bc5726ae-d0b1-473a-9fd1-b1085d2b108e-etc-sysconfig\") pod \"tuned-t6g8t\" (UID: \"bc5726ae-d0b1-473a-9fd1-b1085d2b108e\") " pod="openshift-cluster-node-tuning-operator/tuned-t6g8t"
Apr 16 08:33:09.213395 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.213336 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/3ab93a09-2404-44a9-8381-13976f5a1595-agent-certs\") pod \"konnectivity-agent-j7rrw\" (UID: \"3ab93a09-2404-44a9-8381-13976f5a1595\") " pod="kube-system/konnectivity-agent-j7rrw"
Apr 16 08:33:09.213395 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.213375 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7330bfa4-019f-4fdd-bc61-5919a528f3e1-system-cni-dir\") pod \"multus-additional-cni-plugins-6cdbx\" (UID: \"7330bfa4-019f-4fdd-bc61-5919a528f3e1\") " pod="openshift-multus/multus-additional-cni-plugins-6cdbx"
Apr 16 08:33:09.213486 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.213417 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/09a74c48-2de1-498a-893b-0fa1b8dbd0dd-ovnkube-config\") pod \"ovnkube-node-m577d\" (UID: \"09a74c48-2de1-498a-893b-0fa1b8dbd0dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-m577d"
Apr 16 08:33:09.213486 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.213440 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ed4414d4-f963-4c14-87d5-738798aeb287-host-var-lib-cni-bin\") pod \"multus-mkvns\" (UID: \"ed4414d4-f963-4c14-87d5-738798aeb287\") " pod="openshift-multus/multus-mkvns"
Apr 16 08:33:09.213486 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.213451 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/ed4414d4-f963-4c14-87d5-738798aeb287-host-var-lib-cni-multus\") pod \"multus-mkvns\" (UID: \"ed4414d4-f963-4c14-87d5-738798aeb287\") " pod="openshift-multus/multus-mkvns"
Apr 16 08:33:09.213486 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.213455 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/09a74c48-2de1-498a-893b-0fa1b8dbd0dd-host-run-ovn-kubernetes\") pod \"ovnkube-node-m577d\" (UID: \"09a74c48-2de1-498a-893b-0fa1b8dbd0dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-m577d"
Apr 16 08:33:09.213677 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.213499 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/09a74c48-2de1-498a-893b-0fa1b8dbd0dd-host-run-ovn-kubernetes\") pod \"ovnkube-node-m577d\" (UID: \"09a74c48-2de1-498a-893b-0fa1b8dbd0dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-m577d"
Apr 16 08:33:09.213677 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.213505 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" 
(UniqueName: \"kubernetes.io/host-path/bc5726ae-d0b1-473a-9fd1-b1085d2b108e-etc-systemd\") pod \"tuned-t6g8t\" (UID: \"bc5726ae-d0b1-473a-9fd1-b1085d2b108e\") " pod="openshift-cluster-node-tuning-operator/tuned-t6g8t" Apr 16 08:33:09.213677 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.213547 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2wrn\" (UniqueName: \"kubernetes.io/projected/53e35ca8-ec77-48b7-8e96-ae73f7083c85-kube-api-access-v2wrn\") pod \"network-metrics-daemon-vdxz4\" (UID: \"53e35ca8-ec77-48b7-8e96-ae73f7083c85\") " pod="openshift-multus/network-metrics-daemon-vdxz4" Apr 16 08:33:09.213677 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.213551 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7330bfa4-019f-4fdd-bc61-5919a528f3e1-cnibin\") pod \"multus-additional-cni-plugins-6cdbx\" (UID: \"7330bfa4-019f-4fdd-bc61-5919a528f3e1\") " pod="openshift-multus/multus-additional-cni-plugins-6cdbx" Apr 16 08:33:09.213677 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.213417 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ed4414d4-f963-4c14-87d5-738798aeb287-os-release\") pod \"multus-mkvns\" (UID: \"ed4414d4-f963-4c14-87d5-738798aeb287\") " pod="openshift-multus/multus-mkvns" Apr 16 08:33:09.213677 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.213594 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/bc5726ae-d0b1-473a-9fd1-b1085d2b108e-etc-systemd\") pod \"tuned-t6g8t\" (UID: \"bc5726ae-d0b1-473a-9fd1-b1085d2b108e\") " pod="openshift-cluster-node-tuning-operator/tuned-t6g8t" Apr 16 08:33:09.213677 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.213630 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/09a74c48-2de1-498a-893b-0fa1b8dbd0dd-host-cni-bin\") pod \"ovnkube-node-m577d\" (UID: \"09a74c48-2de1-498a-893b-0fa1b8dbd0dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-m577d" Apr 16 08:33:09.214015 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.213714 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/09a74c48-2de1-498a-893b-0fa1b8dbd0dd-host-cni-bin\") pod \"ovnkube-node-m577d\" (UID: \"09a74c48-2de1-498a-893b-0fa1b8dbd0dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-m577d" Apr 16 08:33:09.214015 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.213761 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/0508d934-e625-43c2-940e-b3bcd7809d6d-sys-fs\") pod \"aws-ebs-csi-driver-node-4mslc\" (UID: \"0508d934-e625-43c2-940e-b3bcd7809d6d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4mslc" Apr 16 08:33:09.214015 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.213978 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/09a74c48-2de1-498a-893b-0fa1b8dbd0dd-ovnkube-config\") pod \"ovnkube-node-m577d\" (UID: \"09a74c48-2de1-498a-893b-0fa1b8dbd0dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-m577d" Apr 16 08:33:09.214162 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.214119 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ed4414d4-f963-4c14-87d5-738798aeb287-cni-binary-copy\") pod \"multus-mkvns\" (UID: \"ed4414d4-f963-4c14-87d5-738798aeb287\") " pod="openshift-multus/multus-mkvns" Apr 16 08:33:09.214936 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.214324 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/bc5726ae-d0b1-473a-9fd1-b1085d2b108e-lib-modules\") pod \"tuned-t6g8t\" (UID: \"bc5726ae-d0b1-473a-9fd1-b1085d2b108e\") " pod="openshift-cluster-node-tuning-operator/tuned-t6g8t" Apr 16 08:33:09.215473 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.215449 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/ed4414d4-f963-4c14-87d5-738798aeb287-multus-daemon-config\") pod \"multus-mkvns\" (UID: \"ed4414d4-f963-4c14-87d5-738798aeb287\") " pod="openshift-multus/multus-mkvns" Apr 16 08:33:09.216043 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.216025 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/bc5726ae-d0b1-473a-9fd1-b1085d2b108e-etc-tuned\") pod \"tuned-t6g8t\" (UID: \"bc5726ae-d0b1-473a-9fd1-b1085d2b108e\") " pod="openshift-cluster-node-tuning-operator/tuned-t6g8t" Apr 16 08:33:09.216520 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:33:09.216269 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 08:33:09.216520 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:33:09.216297 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 08:33:09.216520 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:33:09.216310 2579 projected.go:194] Error preparing data for projected volume kube-api-access-7886z for pod openshift-network-diagnostics/network-check-target-ntc92: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 08:33:09.216783 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:33:09.216699 2579 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/projected/3b6d17c9-2d51-47bc-9e36-95fb034872cb-kube-api-access-7886z podName:3b6d17c9-2d51-47bc-9e36-95fb034872cb nodeName:}" failed. No retries permitted until 2026-04-16 08:33:09.716492701 +0000 UTC m=+3.123531664 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-7886z" (UniqueName: "kubernetes.io/projected/3b6d17c9-2d51-47bc-9e36-95fb034872cb-kube-api-access-7886z") pod "network-check-target-ntc92" (UID: "3b6d17c9-2d51-47bc-9e36-95fb034872cb") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 08:33:09.217218 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.217195 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/09a74c48-2de1-498a-893b-0fa1b8dbd0dd-ovn-node-metrics-cert\") pod \"ovnkube-node-m577d\" (UID: \"09a74c48-2de1-498a-893b-0fa1b8dbd0dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-m577d" Apr 16 08:33:09.218651 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.218482 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkk6b\" (UniqueName: \"kubernetes.io/projected/da769e7e-5234-4962-b0d5-107292fb0b9f-kube-api-access-bkk6b\") pod \"iptables-alerter-fpxkh\" (UID: \"da769e7e-5234-4962-b0d5-107292fb0b9f\") " pod="openshift-network-operator/iptables-alerter-fpxkh" Apr 16 08:33:09.219083 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.219066 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cdvsk\" (UniqueName: \"kubernetes.io/projected/bc5726ae-d0b1-473a-9fd1-b1085d2b108e-kube-api-access-cdvsk\") pod \"tuned-t6g8t\" (UID: \"bc5726ae-d0b1-473a-9fd1-b1085d2b108e\") " pod="openshift-cluster-node-tuning-operator/tuned-t6g8t" Apr 16 08:33:09.220602 ip-10-0-130-41 kubenswrapper[2579]: I0416 
08:33:09.220534 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-prqtx\" (UniqueName: \"kubernetes.io/projected/09a74c48-2de1-498a-893b-0fa1b8dbd0dd-kube-api-access-prqtx\") pod \"ovnkube-node-m577d\" (UID: \"09a74c48-2de1-498a-893b-0fa1b8dbd0dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-m577d" Apr 16 08:33:09.222806 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.222786 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6bzg6\" (UniqueName: \"kubernetes.io/projected/ed4414d4-f963-4c14-87d5-738798aeb287-kube-api-access-6bzg6\") pod \"multus-mkvns\" (UID: \"ed4414d4-f963-4c14-87d5-738798aeb287\") " pod="openshift-multus/multus-mkvns" Apr 16 08:33:09.224378 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.224354 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rx746\" (UniqueName: \"kubernetes.io/projected/9ab7948f-df7f-4fae-ad8e-a2cab21c427e-kube-api-access-rx746\") pod \"node-resolver-6fwv8\" (UID: \"9ab7948f-df7f-4fae-ad8e-a2cab21c427e\") " pod="openshift-dns/node-resolver-6fwv8" Apr 16 08:33:09.224465 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.224419 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bdfrf\" (UniqueName: \"kubernetes.io/projected/2dd4e090-3d55-499c-a3fb-7a04e930e31c-kube-api-access-bdfrf\") pod \"node-ca-hsx9c\" (UID: \"2dd4e090-3d55-499c-a3fb-7a04e930e31c\") " pod="openshift-image-registry/node-ca-hsx9c" Apr 16 08:33:09.225218 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.225200 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7lm7\" (UniqueName: \"kubernetes.io/projected/7330bfa4-019f-4fdd-bc61-5919a528f3e1-kube-api-access-f7lm7\") pod \"multus-additional-cni-plugins-6cdbx\" (UID: \"7330bfa4-019f-4fdd-bc61-5919a528f3e1\") " pod="openshift-multus/multus-additional-cni-plugins-6cdbx" Apr 16 08:33:09.225218 ip-10-0-130-41 
kubenswrapper[2579]: I0416 08:33:09.225209 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hfmwq\" (UniqueName: \"kubernetes.io/projected/0508d934-e625-43c2-940e-b3bcd7809d6d-kube-api-access-hfmwq\") pod \"aws-ebs-csi-driver-node-4mslc\" (UID: \"0508d934-e625-43c2-940e-b3bcd7809d6d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4mslc" Apr 16 08:33:09.314126 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.314095 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v2wrn\" (UniqueName: \"kubernetes.io/projected/53e35ca8-ec77-48b7-8e96-ae73f7083c85-kube-api-access-v2wrn\") pod \"network-metrics-daemon-vdxz4\" (UID: \"53e35ca8-ec77-48b7-8e96-ae73f7083c85\") " pod="openshift-multus/network-metrics-daemon-vdxz4" Apr 16 08:33:09.314309 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.314135 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/53e35ca8-ec77-48b7-8e96-ae73f7083c85-metrics-certs\") pod \"network-metrics-daemon-vdxz4\" (UID: \"53e35ca8-ec77-48b7-8e96-ae73f7083c85\") " pod="openshift-multus/network-metrics-daemon-vdxz4" Apr 16 08:33:09.314309 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:33:09.314243 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 08:33:09.314309 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:33:09.314307 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/53e35ca8-ec77-48b7-8e96-ae73f7083c85-metrics-certs podName:53e35ca8-ec77-48b7-8e96-ae73f7083c85 nodeName:}" failed. No retries permitted until 2026-04-16 08:33:09.814287529 +0000 UTC m=+3.221326488 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/53e35ca8-ec77-48b7-8e96-ae73f7083c85-metrics-certs") pod "network-metrics-daemon-vdxz4" (UID: "53e35ca8-ec77-48b7-8e96-ae73f7083c85") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 08:33:09.323868 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.323801 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2wrn\" (UniqueName: \"kubernetes.io/projected/53e35ca8-ec77-48b7-8e96-ae73f7083c85-kube-api-access-v2wrn\") pod \"network-metrics-daemon-vdxz4\" (UID: \"53e35ca8-ec77-48b7-8e96-ae73f7083c85\") " pod="openshift-multus/network-metrics-daemon-vdxz4" Apr 16 08:33:09.392695 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.392662 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-j7rrw" Apr 16 08:33:09.401515 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.401494 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-mkvns" Apr 16 08:33:09.411116 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.411101 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-fpxkh" Apr 16 08:33:09.416917 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.416860 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-m577d" Apr 16 08:33:09.423554 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.423535 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4mslc" Apr 16 08:33:09.430162 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.430136 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-t6g8t" Apr 16 08:33:09.436719 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.436701 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-6fwv8" Apr 16 08:33:09.443351 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.443330 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-hsx9c" Apr 16 08:33:09.449871 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.449850 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-6cdbx" Apr 16 08:33:09.537604 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.537584 2579 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 08:33:09.817160 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.817130 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/53e35ca8-ec77-48b7-8e96-ae73f7083c85-metrics-certs\") pod \"network-metrics-daemon-vdxz4\" (UID: \"53e35ca8-ec77-48b7-8e96-ae73f7083c85\") " pod="openshift-multus/network-metrics-daemon-vdxz4" Apr 16 08:33:09.817262 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:09.817181 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7886z\" (UniqueName: \"kubernetes.io/projected/3b6d17c9-2d51-47bc-9e36-95fb034872cb-kube-api-access-7886z\") pod \"network-check-target-ntc92\" (UID: \"3b6d17c9-2d51-47bc-9e36-95fb034872cb\") " pod="openshift-network-diagnostics/network-check-target-ntc92" Apr 16 08:33:09.817327 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:33:09.817308 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 08:33:09.817404 
ip-10-0-130-41 kubenswrapper[2579]: E0416 08:33:09.817379 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/53e35ca8-ec77-48b7-8e96-ae73f7083c85-metrics-certs podName:53e35ca8-ec77-48b7-8e96-ae73f7083c85 nodeName:}" failed. No retries permitted until 2026-04-16 08:33:10.817358722 +0000 UTC m=+4.224397679 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/53e35ca8-ec77-48b7-8e96-ae73f7083c85-metrics-certs") pod "network-metrics-daemon-vdxz4" (UID: "53e35ca8-ec77-48b7-8e96-ae73f7083c85") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 08:33:09.817609 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:33:09.817309 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 08:33:09.817683 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:33:09.817627 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 08:33:09.817683 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:33:09.817642 2579 projected.go:194] Error preparing data for projected volume kube-api-access-7886z for pod openshift-network-diagnostics/network-check-target-ntc92: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 08:33:09.817793 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:33:09.817694 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6d17c9-2d51-47bc-9e36-95fb034872cb-kube-api-access-7886z podName:3b6d17c9-2d51-47bc-9e36-95fb034872cb nodeName:}" failed. No retries permitted until 2026-04-16 08:33:10.817679631 +0000 UTC m=+4.224718579 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-7886z" (UniqueName: "kubernetes.io/projected/3b6d17c9-2d51-47bc-9e36-95fb034872cb-kube-api-access-7886z") pod "network-check-target-ntc92" (UID: "3b6d17c9-2d51-47bc-9e36-95fb034872cb") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 08:33:09.822399 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:09.822336 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc5726ae_d0b1_473a_9fd1_b1085d2b108e.slice/crio-b6f9ca552a4a0bdde933a541e73a4e4b3be54d02716f642ddd3963e6d9dd0f57 WatchSource:0}: Error finding container b6f9ca552a4a0bdde933a541e73a4e4b3be54d02716f642ddd3963e6d9dd0f57: Status 404 returned error can't find the container with id b6f9ca552a4a0bdde933a541e73a4e4b3be54d02716f642ddd3963e6d9dd0f57 Apr 16 08:33:09.826586 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:09.826560 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podda769e7e_5234_4962_b0d5_107292fb0b9f.slice/crio-6d7cbf52fd530314d0b8ff86718f5ab786df8522ea64a7374afceb7fa9e0d7c5 WatchSource:0}: Error finding container 6d7cbf52fd530314d0b8ff86718f5ab786df8522ea64a7374afceb7fa9e0d7c5: Status 404 returned error can't find the container with id 6d7cbf52fd530314d0b8ff86718f5ab786df8522ea64a7374afceb7fa9e0d7c5 Apr 16 08:33:09.827175 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:09.827095 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod09a74c48_2de1_498a_893b_0fa1b8dbd0dd.slice/crio-bb1779eef910f1b43d199f1fe0ab12873531ca26f002e08cc02ef99d9c501f44 WatchSource:0}: Error finding container bb1779eef910f1b43d199f1fe0ab12873531ca26f002e08cc02ef99d9c501f44: Status 404 returned error can't find the 
container with id bb1779eef910f1b43d199f1fe0ab12873531ca26f002e08cc02ef99d9c501f44 Apr 16 08:33:09.828107 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:09.828028 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poded4414d4_f963_4c14_87d5_738798aeb287.slice/crio-1715cccec9895449590667da954b2239504437361745b2e09b935886d767f196 WatchSource:0}: Error finding container 1715cccec9895449590667da954b2239504437361745b2e09b935886d767f196: Status 404 returned error can't find the container with id 1715cccec9895449590667da954b2239504437361745b2e09b935886d767f196 Apr 16 08:33:09.829315 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:09.828813 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ab7948f_df7f_4fae_ad8e_a2cab21c427e.slice/crio-ae7b4de7c1925cfb7daaa187eeb21fea38b88e4c10823183ba84bc0ee6a7bd4a WatchSource:0}: Error finding container ae7b4de7c1925cfb7daaa187eeb21fea38b88e4c10823183ba84bc0ee6a7bd4a: Status 404 returned error can't find the container with id ae7b4de7c1925cfb7daaa187eeb21fea38b88e4c10823183ba84bc0ee6a7bd4a Apr 16 08:33:09.833960 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:09.833536 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7330bfa4_019f_4fdd_bc61_5919a528f3e1.slice/crio-8e7cfda13a8a525f2a056d3c5dc63fd787cff65f0012cd1c65ea34b10467ef6d WatchSource:0}: Error finding container 8e7cfda13a8a525f2a056d3c5dc63fd787cff65f0012cd1c65ea34b10467ef6d: Status 404 returned error can't find the container with id 8e7cfda13a8a525f2a056d3c5dc63fd787cff65f0012cd1c65ea34b10467ef6d Apr 16 08:33:09.835225 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:09.835202 2579 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0508d934_e625_43c2_940e_b3bcd7809d6d.slice/crio-1e37b94d25b4358ae0a84d43b112ede10b09e452073a0f07c07e3b0db59f61ad WatchSource:0}: Error finding container 1e37b94d25b4358ae0a84d43b112ede10b09e452073a0f07c07e3b0db59f61ad: Status 404 returned error can't find the container with id 1e37b94d25b4358ae0a84d43b112ede10b09e452073a0f07c07e3b0db59f61ad Apr 16 08:33:10.142565 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:10.142364 2579 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 08:28:08 +0000 UTC" deadline="2027-11-19 22:57:22.624523222 +0000 UTC" Apr 16 08:33:10.142565 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:10.142562 2579 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13982h24m12.481965728s" Apr 16 08:33:10.191072 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:10.191012 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4mslc" event={"ID":"0508d934-e625-43c2-940e-b3bcd7809d6d","Type":"ContainerStarted","Data":"1e37b94d25b4358ae0a84d43b112ede10b09e452073a0f07c07e3b0db59f61ad"} Apr 16 08:33:10.192426 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:10.192394 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6cdbx" event={"ID":"7330bfa4-019f-4fdd-bc61-5919a528f3e1","Type":"ContainerStarted","Data":"8e7cfda13a8a525f2a056d3c5dc63fd787cff65f0012cd1c65ea34b10467ef6d"} Apr 16 08:33:10.193715 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:10.193689 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-hsx9c" event={"ID":"2dd4e090-3d55-499c-a3fb-7a04e930e31c","Type":"ContainerStarted","Data":"6704e70d4c7be2fec698061a9e917a0ad49b22149d6bdf7733512b1b1a69ba92"} Apr 16 08:33:10.195487 ip-10-0-130-41 
kubenswrapper[2579]: I0416 08:33:10.195464 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-j7rrw" event={"ID":"3ab93a09-2404-44a9-8381-13976f5a1595","Type":"ContainerStarted","Data":"599d5b79b4903484a93e22be46ffc8c265b41d918b31e2bf9c7841d33bdda82d"} Apr 16 08:33:10.196651 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:10.196628 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-mkvns" event={"ID":"ed4414d4-f963-4c14-87d5-738798aeb287","Type":"ContainerStarted","Data":"1715cccec9895449590667da954b2239504437361745b2e09b935886d767f196"} Apr 16 08:33:10.198222 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:10.198197 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m577d" event={"ID":"09a74c48-2de1-498a-893b-0fa1b8dbd0dd","Type":"ContainerStarted","Data":"bb1779eef910f1b43d199f1fe0ab12873531ca26f002e08cc02ef99d9c501f44"} Apr 16 08:33:10.199308 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:10.199282 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-fpxkh" event={"ID":"da769e7e-5234-4962-b0d5-107292fb0b9f","Type":"ContainerStarted","Data":"6d7cbf52fd530314d0b8ff86718f5ab786df8522ea64a7374afceb7fa9e0d7c5"} Apr 16 08:33:10.200515 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:10.200492 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-6fwv8" event={"ID":"9ab7948f-df7f-4fae-ad8e-a2cab21c427e","Type":"ContainerStarted","Data":"ae7b4de7c1925cfb7daaa187eeb21fea38b88e4c10823183ba84bc0ee6a7bd4a"} Apr 16 08:33:10.201782 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:10.201749 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-t6g8t" event={"ID":"bc5726ae-d0b1-473a-9fd1-b1085d2b108e","Type":"ContainerStarted","Data":"b6f9ca552a4a0bdde933a541e73a4e4b3be54d02716f642ddd3963e6d9dd0f57"} Apr 16 08:33:10.204984 
ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:10.204948 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-41.ec2.internal" event={"ID":"2c43aaa946be06bbbb13a479a93166e2","Type":"ContainerStarted","Data":"7e7fc3e3f7f4f79b04fde4db907c94d1f1f4e83a1a1a0415f1cd20a2fac520c1"} Apr 16 08:33:10.220924 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:10.220859 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-41.ec2.internal" podStartSLOduration=2.220844332 podStartE2EDuration="2.220844332s" podCreationTimestamp="2026-04-16 08:33:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 08:33:10.220257365 +0000 UTC m=+3.627296329" watchObservedRunningTime="2026-04-16 08:33:10.220844332 +0000 UTC m=+3.627883296" Apr 16 08:33:10.826693 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:10.826308 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/53e35ca8-ec77-48b7-8e96-ae73f7083c85-metrics-certs\") pod \"network-metrics-daemon-vdxz4\" (UID: \"53e35ca8-ec77-48b7-8e96-ae73f7083c85\") " pod="openshift-multus/network-metrics-daemon-vdxz4" Apr 16 08:33:10.826693 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:10.826360 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7886z\" (UniqueName: \"kubernetes.io/projected/3b6d17c9-2d51-47bc-9e36-95fb034872cb-kube-api-access-7886z\") pod \"network-check-target-ntc92\" (UID: \"3b6d17c9-2d51-47bc-9e36-95fb034872cb\") " pod="openshift-network-diagnostics/network-check-target-ntc92" Apr 16 08:33:10.826693 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:33:10.826463 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not 
registered
Apr 16 08:33:10.826693 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:33:10.826515 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 08:33:10.826693 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:33:10.826534 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 08:33:10.826693 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:33:10.826545 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/53e35ca8-ec77-48b7-8e96-ae73f7083c85-metrics-certs podName:53e35ca8-ec77-48b7-8e96-ae73f7083c85 nodeName:}" failed. No retries permitted until 2026-04-16 08:33:12.826517088 +0000 UTC m=+6.233556033 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/53e35ca8-ec77-48b7-8e96-ae73f7083c85-metrics-certs") pod "network-metrics-daemon-vdxz4" (UID: "53e35ca8-ec77-48b7-8e96-ae73f7083c85") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 08:33:10.826693 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:33:10.826546 2579 projected.go:194] Error preparing data for projected volume kube-api-access-7886z for pod openshift-network-diagnostics/network-check-target-ntc92: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 08:33:10.826693 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:33:10.826615 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6d17c9-2d51-47bc-9e36-95fb034872cb-kube-api-access-7886z podName:3b6d17c9-2d51-47bc-9e36-95fb034872cb nodeName:}" failed. No retries permitted until 2026-04-16 08:33:12.82659906 +0000 UTC m=+6.233638004 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-7886z" (UniqueName: "kubernetes.io/projected/3b6d17c9-2d51-47bc-9e36-95fb034872cb-kube-api-access-7886z") pod "network-check-target-ntc92" (UID: "3b6d17c9-2d51-47bc-9e36-95fb034872cb") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 08:33:11.186065 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:11.185207 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ntc92"
Apr 16 08:33:11.186065 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:33:11.185375 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ntc92" podUID="3b6d17c9-2d51-47bc-9e36-95fb034872cb"
Apr 16 08:33:11.186065 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:11.185902 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vdxz4"
Apr 16 08:33:11.186065 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:33:11.186027 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vdxz4" podUID="53e35ca8-ec77-48b7-8e96-ae73f7083c85"
Apr 16 08:33:11.218199 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:11.218159 2579 generic.go:358] "Generic (PLEG): container finished" podID="bb5cb6fea81fd2041eddb8622fb55be0" containerID="689bbec2d47643cfd8317b7cf1bbb37e0593e3cc6c12de745b3c6a0795e45a39" exitCode=0
Apr 16 08:33:11.219173 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:11.219145 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-41.ec2.internal" event={"ID":"bb5cb6fea81fd2041eddb8622fb55be0","Type":"ContainerDied","Data":"689bbec2d47643cfd8317b7cf1bbb37e0593e3cc6c12de745b3c6a0795e45a39"}
Apr 16 08:33:11.676937 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:11.676118 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-ktjq7"]
Apr 16 08:33:11.678820 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:11.678356 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ktjq7"
Apr 16 08:33:11.678820 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:33:11.678426 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-ktjq7" podUID="fca15501-2323-4079-b35a-57d66f40d0b1"
Apr 16 08:33:11.733388 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:11.733148 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/fca15501-2323-4079-b35a-57d66f40d0b1-dbus\") pod \"global-pull-secret-syncer-ktjq7\" (UID: \"fca15501-2323-4079-b35a-57d66f40d0b1\") " pod="kube-system/global-pull-secret-syncer-ktjq7"
Apr 16 08:33:11.733388 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:11.733212 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/fca15501-2323-4079-b35a-57d66f40d0b1-original-pull-secret\") pod \"global-pull-secret-syncer-ktjq7\" (UID: \"fca15501-2323-4079-b35a-57d66f40d0b1\") " pod="kube-system/global-pull-secret-syncer-ktjq7"
Apr 16 08:33:11.733388 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:11.733255 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/fca15501-2323-4079-b35a-57d66f40d0b1-kubelet-config\") pod \"global-pull-secret-syncer-ktjq7\" (UID: \"fca15501-2323-4079-b35a-57d66f40d0b1\") " pod="kube-system/global-pull-secret-syncer-ktjq7"
Apr 16 08:33:11.834577 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:11.834540 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/fca15501-2323-4079-b35a-57d66f40d0b1-original-pull-secret\") pod \"global-pull-secret-syncer-ktjq7\" (UID: \"fca15501-2323-4079-b35a-57d66f40d0b1\") " pod="kube-system/global-pull-secret-syncer-ktjq7"
Apr 16 08:33:11.834790 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:11.834600 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/fca15501-2323-4079-b35a-57d66f40d0b1-kubelet-config\") pod \"global-pull-secret-syncer-ktjq7\" (UID: \"fca15501-2323-4079-b35a-57d66f40d0b1\") " pod="kube-system/global-pull-secret-syncer-ktjq7"
Apr 16 08:33:11.834790 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:11.834665 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/fca15501-2323-4079-b35a-57d66f40d0b1-dbus\") pod \"global-pull-secret-syncer-ktjq7\" (UID: \"fca15501-2323-4079-b35a-57d66f40d0b1\") " pod="kube-system/global-pull-secret-syncer-ktjq7"
Apr 16 08:33:11.834999 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:11.834840 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/fca15501-2323-4079-b35a-57d66f40d0b1-dbus\") pod \"global-pull-secret-syncer-ktjq7\" (UID: \"fca15501-2323-4079-b35a-57d66f40d0b1\") " pod="kube-system/global-pull-secret-syncer-ktjq7"
Apr 16 08:33:11.834999 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:33:11.834976 2579 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 08:33:11.835145 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:33:11.835034 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fca15501-2323-4079-b35a-57d66f40d0b1-original-pull-secret podName:fca15501-2323-4079-b35a-57d66f40d0b1 nodeName:}" failed. No retries permitted until 2026-04-16 08:33:12.335015894 +0000 UTC m=+5.742054844 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/fca15501-2323-4079-b35a-57d66f40d0b1-original-pull-secret") pod "global-pull-secret-syncer-ktjq7" (UID: "fca15501-2323-4079-b35a-57d66f40d0b1") : object "kube-system"/"original-pull-secret" not registered
Apr 16 08:33:11.835145 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:11.835096 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/fca15501-2323-4079-b35a-57d66f40d0b1-kubelet-config\") pod \"global-pull-secret-syncer-ktjq7\" (UID: \"fca15501-2323-4079-b35a-57d66f40d0b1\") " pod="kube-system/global-pull-secret-syncer-ktjq7"
Apr 16 08:33:12.233635 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:12.233569 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-41.ec2.internal" event={"ID":"bb5cb6fea81fd2041eddb8622fb55be0","Type":"ContainerStarted","Data":"d2c4672689f4a1e063bd1957d87fb0f2b71f335e5eea8687b31a0efc8e3d7ff7"}
Apr 16 08:33:12.252460 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:12.251666 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-41.ec2.internal" podStartSLOduration=4.251653579 podStartE2EDuration="4.251653579s" podCreationTimestamp="2026-04-16 08:33:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 08:33:12.24708768 +0000 UTC m=+5.654126643" watchObservedRunningTime="2026-04-16 08:33:12.251653579 +0000 UTC m=+5.658692543"
Apr 16 08:33:12.341121 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:12.340561 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/fca15501-2323-4079-b35a-57d66f40d0b1-original-pull-secret\") pod \"global-pull-secret-syncer-ktjq7\" (UID: \"fca15501-2323-4079-b35a-57d66f40d0b1\") " pod="kube-system/global-pull-secret-syncer-ktjq7"
Apr 16 08:33:12.341121 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:33:12.340708 2579 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 08:33:12.341121 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:33:12.340768 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fca15501-2323-4079-b35a-57d66f40d0b1-original-pull-secret podName:fca15501-2323-4079-b35a-57d66f40d0b1 nodeName:}" failed. No retries permitted until 2026-04-16 08:33:13.340748111 +0000 UTC m=+6.747787053 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/fca15501-2323-4079-b35a-57d66f40d0b1-original-pull-secret") pod "global-pull-secret-syncer-ktjq7" (UID: "fca15501-2323-4079-b35a-57d66f40d0b1") : object "kube-system"/"original-pull-secret" not registered
Apr 16 08:33:12.844368 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:12.844316 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/53e35ca8-ec77-48b7-8e96-ae73f7083c85-metrics-certs\") pod \"network-metrics-daemon-vdxz4\" (UID: \"53e35ca8-ec77-48b7-8e96-ae73f7083c85\") " pod="openshift-multus/network-metrics-daemon-vdxz4"
Apr 16 08:33:12.844558 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:12.844376 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7886z\" (UniqueName: \"kubernetes.io/projected/3b6d17c9-2d51-47bc-9e36-95fb034872cb-kube-api-access-7886z\") pod \"network-check-target-ntc92\" (UID: \"3b6d17c9-2d51-47bc-9e36-95fb034872cb\") " pod="openshift-network-diagnostics/network-check-target-ntc92"
Apr 16 08:33:12.844625 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:33:12.844554 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 08:33:12.844625 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:33:12.844561 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 08:33:12.844625 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:33:12.844574 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 08:33:12.844625 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:33:12.844586 2579 projected.go:194] Error preparing data for projected volume kube-api-access-7886z for pod openshift-network-diagnostics/network-check-target-ntc92: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 08:33:12.844807 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:33:12.844630 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/53e35ca8-ec77-48b7-8e96-ae73f7083c85-metrics-certs podName:53e35ca8-ec77-48b7-8e96-ae73f7083c85 nodeName:}" failed. No retries permitted until 2026-04-16 08:33:16.844612189 +0000 UTC m=+10.251651129 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/53e35ca8-ec77-48b7-8e96-ae73f7083c85-metrics-certs") pod "network-metrics-daemon-vdxz4" (UID: "53e35ca8-ec77-48b7-8e96-ae73f7083c85") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 08:33:12.844807 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:33:12.844649 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6d17c9-2d51-47bc-9e36-95fb034872cb-kube-api-access-7886z podName:3b6d17c9-2d51-47bc-9e36-95fb034872cb nodeName:}" failed. No retries permitted until 2026-04-16 08:33:16.844639647 +0000 UTC m=+10.251678591 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-7886z" (UniqueName: "kubernetes.io/projected/3b6d17c9-2d51-47bc-9e36-95fb034872cb-kube-api-access-7886z") pod "network-check-target-ntc92" (UID: "3b6d17c9-2d51-47bc-9e36-95fb034872cb") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 08:33:13.185488 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:13.185407 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ntc92"
Apr 16 08:33:13.185639 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:33:13.185539 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ntc92" podUID="3b6d17c9-2d51-47bc-9e36-95fb034872cb"
Apr 16 08:33:13.186161 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:13.185971 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vdxz4"
Apr 16 08:33:13.186161 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:33:13.186072 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vdxz4" podUID="53e35ca8-ec77-48b7-8e96-ae73f7083c85"
Apr 16 08:33:13.186375 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:13.186225 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ktjq7"
Apr 16 08:33:13.186375 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:33:13.186330 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-ktjq7" podUID="fca15501-2323-4079-b35a-57d66f40d0b1"
Apr 16 08:33:13.348786 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:13.348747 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/fca15501-2323-4079-b35a-57d66f40d0b1-original-pull-secret\") pod \"global-pull-secret-syncer-ktjq7\" (UID: \"fca15501-2323-4079-b35a-57d66f40d0b1\") " pod="kube-system/global-pull-secret-syncer-ktjq7"
Apr 16 08:33:13.349265 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:33:13.348937 2579 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 08:33:13.349265 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:33:13.349006 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fca15501-2323-4079-b35a-57d66f40d0b1-original-pull-secret podName:fca15501-2323-4079-b35a-57d66f40d0b1 nodeName:}" failed. No retries permitted until 2026-04-16 08:33:15.348991895 +0000 UTC m=+8.756030835 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/fca15501-2323-4079-b35a-57d66f40d0b1-original-pull-secret") pod "global-pull-secret-syncer-ktjq7" (UID: "fca15501-2323-4079-b35a-57d66f40d0b1") : object "kube-system"/"original-pull-secret" not registered
Apr 16 08:33:15.184727 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:15.184693 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ktjq7"
Apr 16 08:33:15.185212 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:33:15.184824 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-ktjq7" podUID="fca15501-2323-4079-b35a-57d66f40d0b1"
Apr 16 08:33:15.185212 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:15.184695 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vdxz4"
Apr 16 08:33:15.185212 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:33:15.184945 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vdxz4" podUID="53e35ca8-ec77-48b7-8e96-ae73f7083c85"
Apr 16 08:33:15.185503 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:15.185364 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ntc92"
Apr 16 08:33:15.185503 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:33:15.185468 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ntc92" podUID="3b6d17c9-2d51-47bc-9e36-95fb034872cb"
Apr 16 08:33:15.364820 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:15.364782 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/fca15501-2323-4079-b35a-57d66f40d0b1-original-pull-secret\") pod \"global-pull-secret-syncer-ktjq7\" (UID: \"fca15501-2323-4079-b35a-57d66f40d0b1\") " pod="kube-system/global-pull-secret-syncer-ktjq7"
Apr 16 08:33:15.365022 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:33:15.364936 2579 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 08:33:15.365022 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:33:15.365018 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fca15501-2323-4079-b35a-57d66f40d0b1-original-pull-secret podName:fca15501-2323-4079-b35a-57d66f40d0b1 nodeName:}" failed. No retries permitted until 2026-04-16 08:33:19.364995504 +0000 UTC m=+12.772034446 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/fca15501-2323-4079-b35a-57d66f40d0b1-original-pull-secret") pod "global-pull-secret-syncer-ktjq7" (UID: "fca15501-2323-4079-b35a-57d66f40d0b1") : object "kube-system"/"original-pull-secret" not registered
Apr 16 08:33:16.879454 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:16.879418 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/53e35ca8-ec77-48b7-8e96-ae73f7083c85-metrics-certs\") pod \"network-metrics-daemon-vdxz4\" (UID: \"53e35ca8-ec77-48b7-8e96-ae73f7083c85\") " pod="openshift-multus/network-metrics-daemon-vdxz4"
Apr 16 08:33:16.879884 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:16.879468 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7886z\" (UniqueName: \"kubernetes.io/projected/3b6d17c9-2d51-47bc-9e36-95fb034872cb-kube-api-access-7886z\") pod \"network-check-target-ntc92\" (UID: \"3b6d17c9-2d51-47bc-9e36-95fb034872cb\") " pod="openshift-network-diagnostics/network-check-target-ntc92"
Apr 16 08:33:16.879884 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:33:16.879580 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 08:33:16.879884 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:33:16.879585 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 08:33:16.879884 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:33:16.879615 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 08:33:16.879884 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:33:16.879628 2579 projected.go:194] Error preparing data for projected volume kube-api-access-7886z for pod openshift-network-diagnostics/network-check-target-ntc92: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 08:33:16.879884 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:33:16.879650 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/53e35ca8-ec77-48b7-8e96-ae73f7083c85-metrics-certs podName:53e35ca8-ec77-48b7-8e96-ae73f7083c85 nodeName:}" failed. No retries permitted until 2026-04-16 08:33:24.879631099 +0000 UTC m=+18.286670047 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/53e35ca8-ec77-48b7-8e96-ae73f7083c85-metrics-certs") pod "network-metrics-daemon-vdxz4" (UID: "53e35ca8-ec77-48b7-8e96-ae73f7083c85") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 08:33:16.879884 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:33:16.879677 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6d17c9-2d51-47bc-9e36-95fb034872cb-kube-api-access-7886z podName:3b6d17c9-2d51-47bc-9e36-95fb034872cb nodeName:}" failed. No retries permitted until 2026-04-16 08:33:24.879661936 +0000 UTC m=+18.286700888 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-7886z" (UniqueName: "kubernetes.io/projected/3b6d17c9-2d51-47bc-9e36-95fb034872cb-kube-api-access-7886z") pod "network-check-target-ntc92" (UID: "3b6d17c9-2d51-47bc-9e36-95fb034872cb") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 08:33:17.185828 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:17.185749 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vdxz4"
Apr 16 08:33:17.185994 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:33:17.185874 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vdxz4" podUID="53e35ca8-ec77-48b7-8e96-ae73f7083c85"
Apr 16 08:33:17.186078 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:17.186001 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ktjq7"
Apr 16 08:33:17.186078 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:17.186026 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ntc92"
Apr 16 08:33:17.186181 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:33:17.186132 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-ktjq7" podUID="fca15501-2323-4079-b35a-57d66f40d0b1"
Apr 16 08:33:17.186269 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:33:17.186243 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ntc92" podUID="3b6d17c9-2d51-47bc-9e36-95fb034872cb"
Apr 16 08:33:19.185405 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:19.185328 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ktjq7"
Apr 16 08:33:19.185405 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:19.185374 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ntc92"
Apr 16 08:33:19.185815 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:33:19.185448 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-ktjq7" podUID="fca15501-2323-4079-b35a-57d66f40d0b1"
Apr 16 08:33:19.185815 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:19.185726 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vdxz4"
Apr 16 08:33:19.185815 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:33:19.185792 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vdxz4" podUID="53e35ca8-ec77-48b7-8e96-ae73f7083c85"
Apr 16 08:33:19.185975 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:33:19.185835 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ntc92" podUID="3b6d17c9-2d51-47bc-9e36-95fb034872cb"
Apr 16 08:33:19.399408 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:19.399375 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/fca15501-2323-4079-b35a-57d66f40d0b1-original-pull-secret\") pod \"global-pull-secret-syncer-ktjq7\" (UID: \"fca15501-2323-4079-b35a-57d66f40d0b1\") " pod="kube-system/global-pull-secret-syncer-ktjq7"
Apr 16 08:33:19.399576 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:33:19.399509 2579 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 08:33:19.399576 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:33:19.399565 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fca15501-2323-4079-b35a-57d66f40d0b1-original-pull-secret podName:fca15501-2323-4079-b35a-57d66f40d0b1 nodeName:}" failed. No retries permitted until 2026-04-16 08:33:27.399547532 +0000 UTC m=+20.806586476 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/fca15501-2323-4079-b35a-57d66f40d0b1-original-pull-secret") pod "global-pull-secret-syncer-ktjq7" (UID: "fca15501-2323-4079-b35a-57d66f40d0b1") : object "kube-system"/"original-pull-secret" not registered
Apr 16 08:33:21.185007 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:21.184970 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vdxz4"
Apr 16 08:33:21.185007 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:21.184989 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ntc92"
Apr 16 08:33:21.185508 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:21.184979 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ktjq7"
Apr 16 08:33:21.185508 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:33:21.185118 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vdxz4" podUID="53e35ca8-ec77-48b7-8e96-ae73f7083c85"
Apr 16 08:33:21.185508 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:33:21.185227 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-ktjq7" podUID="fca15501-2323-4079-b35a-57d66f40d0b1"
Apr 16 08:33:21.185508 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:33:21.185318 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ntc92" podUID="3b6d17c9-2d51-47bc-9e36-95fb034872cb"
Apr 16 08:33:23.184650 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:23.184614 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ntc92"
Apr 16 08:33:23.185129 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:23.184658 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ktjq7"
Apr 16 08:33:23.185129 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:23.184630 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vdxz4"
Apr 16 08:33:23.185129 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:33:23.184733 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ntc92" podUID="3b6d17c9-2d51-47bc-9e36-95fb034872cb"
Apr 16 08:33:23.185129 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:33:23.184809 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vdxz4" podUID="53e35ca8-ec77-48b7-8e96-ae73f7083c85"
Apr 16 08:33:23.185129 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:33:23.184926 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-ktjq7" podUID="fca15501-2323-4079-b35a-57d66f40d0b1"
Apr 16 08:33:24.940526 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:24.940479 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/53e35ca8-ec77-48b7-8e96-ae73f7083c85-metrics-certs\") pod \"network-metrics-daemon-vdxz4\" (UID: \"53e35ca8-ec77-48b7-8e96-ae73f7083c85\") " pod="openshift-multus/network-metrics-daemon-vdxz4"
Apr 16 08:33:24.941152 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:24.940535 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7886z\" (UniqueName: \"kubernetes.io/projected/3b6d17c9-2d51-47bc-9e36-95fb034872cb-kube-api-access-7886z\") pod \"network-check-target-ntc92\" (UID: \"3b6d17c9-2d51-47bc-9e36-95fb034872cb\") " pod="openshift-network-diagnostics/network-check-target-ntc92"
Apr 16 08:33:24.941152 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:33:24.940658 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 08:33:24.941152 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:33:24.940665 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 08:33:24.941152 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:33:24.940695 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 08:33:24.941152 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:33:24.940708 2579 projected.go:194] Error preparing data for projected volume kube-api-access-7886z for pod openshift-network-diagnostics/network-check-target-ntc92: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 08:33:24.941152 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:33:24.940742 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/53e35ca8-ec77-48b7-8e96-ae73f7083c85-metrics-certs podName:53e35ca8-ec77-48b7-8e96-ae73f7083c85 nodeName:}" failed. No retries permitted until 2026-04-16 08:33:40.940720674 +0000 UTC m=+34.347759627 (durationBeforeRetry 16s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/53e35ca8-ec77-48b7-8e96-ae73f7083c85-metrics-certs") pod "network-metrics-daemon-vdxz4" (UID: "53e35ca8-ec77-48b7-8e96-ae73f7083c85") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 08:33:24.941152 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:33:24.940759 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6d17c9-2d51-47bc-9e36-95fb034872cb-kube-api-access-7886z podName:3b6d17c9-2d51-47bc-9e36-95fb034872cb nodeName:}" failed. No retries permitted until 2026-04-16 08:33:40.940752434 +0000 UTC m=+34.347791375 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-7886z" (UniqueName: "kubernetes.io/projected/3b6d17c9-2d51-47bc-9e36-95fb034872cb-kube-api-access-7886z") pod "network-check-target-ntc92" (UID: "3b6d17c9-2d51-47bc-9e36-95fb034872cb") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 08:33:25.185031 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:25.185002 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ntc92" Apr 16 08:33:25.185217 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:25.185102 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ktjq7" Apr 16 08:33:25.185217 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:33:25.185105 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-ntc92" podUID="3b6d17c9-2d51-47bc-9e36-95fb034872cb" Apr 16 08:33:25.185217 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:25.185003 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vdxz4" Apr 16 08:33:25.185217 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:33:25.185206 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-ktjq7" podUID="fca15501-2323-4079-b35a-57d66f40d0b1" Apr 16 08:33:25.185425 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:33:25.185255 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vdxz4" podUID="53e35ca8-ec77-48b7-8e96-ae73f7083c85" Apr 16 08:33:27.185585 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:27.185565 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vdxz4" Apr 16 08:33:27.185990 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:27.185622 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ktjq7" Apr 16 08:33:27.185990 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:27.185629 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ntc92" Apr 16 08:33:27.185990 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:33:27.185690 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-ktjq7" podUID="fca15501-2323-4079-b35a-57d66f40d0b1" Apr 16 08:33:27.185990 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:33:27.185759 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ntc92" podUID="3b6d17c9-2d51-47bc-9e36-95fb034872cb" Apr 16 08:33:27.185990 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:33:27.185834 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vdxz4" podUID="53e35ca8-ec77-48b7-8e96-ae73f7083c85" Apr 16 08:33:27.260094 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:27.260073 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-m577d_09a74c48-2de1-498a-893b-0fa1b8dbd0dd/ovn-acl-logging/0.log" Apr 16 08:33:27.260376 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:27.260357 2579 generic.go:358] "Generic (PLEG): container finished" podID="09a74c48-2de1-498a-893b-0fa1b8dbd0dd" containerID="49e557d613f81d02ce2e01056daa04394b4bde789f34fc784bcffc73f8edb81b" exitCode=1 Apr 16 08:33:27.260494 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:27.260420 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m577d" event={"ID":"09a74c48-2de1-498a-893b-0fa1b8dbd0dd","Type":"ContainerStarted","Data":"ed646ba6cf7346a26a0ea5c6fa6bb562b0f55c07950c2c0f8fdfb159a1e84ccf"} Apr 16 08:33:27.260494 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:27.260441 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m577d" event={"ID":"09a74c48-2de1-498a-893b-0fa1b8dbd0dd","Type":"ContainerDied","Data":"49e557d613f81d02ce2e01056daa04394b4bde789f34fc784bcffc73f8edb81b"} Apr 16 08:33:27.260494 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:27.260451 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m577d" event={"ID":"09a74c48-2de1-498a-893b-0fa1b8dbd0dd","Type":"ContainerStarted","Data":"600456a6e2bd3935b38914343917aabd76b94554203e79b933d358260cc30aca"} Apr 16 08:33:27.261518 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:27.261498 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-6fwv8" event={"ID":"9ab7948f-df7f-4fae-ad8e-a2cab21c427e","Type":"ContainerStarted","Data":"52ab224836892c94c69dc7c106857b723598b98f3c6950f882540eaca4a1f7ae"} Apr 16 08:33:27.262643 ip-10-0-130-41 
kubenswrapper[2579]: I0416 08:33:27.262618 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-t6g8t" event={"ID":"bc5726ae-d0b1-473a-9fd1-b1085d2b108e","Type":"ContainerStarted","Data":"1ce945c9efb3ff6fffd03f5058633a76717617406160ed40073f3bc4a2070567"} Apr 16 08:33:27.263705 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:27.263688 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4mslc" event={"ID":"0508d934-e625-43c2-940e-b3bcd7809d6d","Type":"ContainerStarted","Data":"e0057aba044497020447529ea6c9e1fac08a8365bd7722d33914899501b2042f"} Apr 16 08:33:27.265120 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:27.265094 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6cdbx" event={"ID":"7330bfa4-019f-4fdd-bc61-5919a528f3e1","Type":"ContainerStarted","Data":"fb4612c510a256041b828ba6f8e26eadc9c6591194b4c5eff51cc9ce96edda21"} Apr 16 08:33:27.266417 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:27.266388 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-hsx9c" event={"ID":"2dd4e090-3d55-499c-a3fb-7a04e930e31c","Type":"ContainerStarted","Data":"53142e40a94228504c8be2154b80b147f28f56bd2346a6ff24c6bf8c12447d57"} Apr 16 08:33:27.267704 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:27.267687 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-j7rrw" event={"ID":"3ab93a09-2404-44a9-8381-13976f5a1595","Type":"ContainerStarted","Data":"7b2fb316e61f12028da66757e0c9b16db8f0d23d0e226b72081af9f1594c47ac"} Apr 16 08:33:27.268989 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:27.268962 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-mkvns" event={"ID":"ed4414d4-f963-4c14-87d5-738798aeb287","Type":"ContainerStarted","Data":"8562f13074c2df3826103e42a08e516bf7482dd4945b155fb4e51461101590aa"} Apr 
16 08:33:27.301033 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:27.300997 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-j7rrw" podStartSLOduration=3.31848201 podStartE2EDuration="20.300985711s" podCreationTimestamp="2026-04-16 08:33:07 +0000 UTC" firstStartedPulling="2026-04-16 08:33:09.832004153 +0000 UTC m=+3.239043100" lastFinishedPulling="2026-04-16 08:33:26.81450786 +0000 UTC m=+20.221546801" observedRunningTime="2026-04-16 08:33:27.300451789 +0000 UTC m=+20.707490752" watchObservedRunningTime="2026-04-16 08:33:27.300985711 +0000 UTC m=+20.708024675" Apr 16 08:33:27.301175 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:27.301157 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-6fwv8" podStartSLOduration=3.349389153 podStartE2EDuration="20.301153399s" podCreationTimestamp="2026-04-16 08:33:07 +0000 UTC" firstStartedPulling="2026-04-16 08:33:09.831349274 +0000 UTC m=+3.238388216" lastFinishedPulling="2026-04-16 08:33:26.783113507 +0000 UTC m=+20.190152462" observedRunningTime="2026-04-16 08:33:27.280816705 +0000 UTC m=+20.687855665" watchObservedRunningTime="2026-04-16 08:33:27.301153399 +0000 UTC m=+20.708192362" Apr 16 08:33:27.320542 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:27.320497 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-t6g8t" podStartSLOduration=3.323704106 podStartE2EDuration="20.320481305s" podCreationTimestamp="2026-04-16 08:33:07 +0000 UTC" firstStartedPulling="2026-04-16 08:33:09.824804482 +0000 UTC m=+3.231843423" lastFinishedPulling="2026-04-16 08:33:26.821581679 +0000 UTC m=+20.228620622" observedRunningTime="2026-04-16 08:33:27.320285689 +0000 UTC m=+20.727324651" watchObservedRunningTime="2026-04-16 08:33:27.320481305 +0000 UTC m=+20.727520300" Apr 16 08:33:27.373212 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:27.373085 2579 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-hsx9c" podStartSLOduration=7.871363372 podStartE2EDuration="20.373069319s" podCreationTimestamp="2026-04-16 08:33:07 +0000 UTC" firstStartedPulling="2026-04-16 08:33:09.833163604 +0000 UTC m=+3.240202558" lastFinishedPulling="2026-04-16 08:33:22.334869549 +0000 UTC m=+15.741908505" observedRunningTime="2026-04-16 08:33:27.340434629 +0000 UTC m=+20.747473595" watchObservedRunningTime="2026-04-16 08:33:27.373069319 +0000 UTC m=+20.780108771" Apr 16 08:33:27.397713 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:27.397663 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-mkvns" podStartSLOduration=3.406062547 podStartE2EDuration="20.39764697s" podCreationTimestamp="2026-04-16 08:33:07 +0000 UTC" firstStartedPulling="2026-04-16 08:33:09.830362483 +0000 UTC m=+3.237401424" lastFinishedPulling="2026-04-16 08:33:26.821946906 +0000 UTC m=+20.228985847" observedRunningTime="2026-04-16 08:33:27.396673469 +0000 UTC m=+20.803712431" watchObservedRunningTime="2026-04-16 08:33:27.39764697 +0000 UTC m=+20.804685936" Apr 16 08:33:27.460069 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:27.460035 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/fca15501-2323-4079-b35a-57d66f40d0b1-original-pull-secret\") pod \"global-pull-secret-syncer-ktjq7\" (UID: \"fca15501-2323-4079-b35a-57d66f40d0b1\") " pod="kube-system/global-pull-secret-syncer-ktjq7" Apr 16 08:33:27.460237 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:33:27.460148 2579 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 08:33:27.460237 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:33:27.460203 2579 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/fca15501-2323-4079-b35a-57d66f40d0b1-original-pull-secret podName:fca15501-2323-4079-b35a-57d66f40d0b1 nodeName:}" failed. No retries permitted until 2026-04-16 08:33:43.460185881 +0000 UTC m=+36.867224839 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/fca15501-2323-4079-b35a-57d66f40d0b1-original-pull-secret") pod "global-pull-secret-syncer-ktjq7" (UID: "fca15501-2323-4079-b35a-57d66f40d0b1") : object "kube-system"/"original-pull-secret" not registered Apr 16 08:33:28.272088 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:28.272009 2579 generic.go:358] "Generic (PLEG): container finished" podID="7330bfa4-019f-4fdd-bc61-5919a528f3e1" containerID="fb4612c510a256041b828ba6f8e26eadc9c6591194b4c5eff51cc9ce96edda21" exitCode=0 Apr 16 08:33:28.272697 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:28.272103 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6cdbx" event={"ID":"7330bfa4-019f-4fdd-bc61-5919a528f3e1","Type":"ContainerDied","Data":"fb4612c510a256041b828ba6f8e26eadc9c6591194b4c5eff51cc9ce96edda21"} Apr 16 08:33:28.274316 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:28.274298 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-m577d_09a74c48-2de1-498a-893b-0fa1b8dbd0dd/ovn-acl-logging/0.log" Apr 16 08:33:28.274652 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:28.274632 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m577d" event={"ID":"09a74c48-2de1-498a-893b-0fa1b8dbd0dd","Type":"ContainerStarted","Data":"b073e448ed1cbc4d623cbcfd2725827508ed7af28e1b6e1c9709788a9ef08b48"} Apr 16 08:33:28.274730 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:28.274659 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m577d" 
event={"ID":"09a74c48-2de1-498a-893b-0fa1b8dbd0dd","Type":"ContainerStarted","Data":"60c06cb137a617ab547781c0d7b4dd318ea13d0ad0fa7e0bc139dffcc7e26b29"} Apr 16 08:33:28.274730 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:28.274669 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m577d" event={"ID":"09a74c48-2de1-498a-893b-0fa1b8dbd0dd","Type":"ContainerStarted","Data":"eea0dd292253d118e040e0014cde8b6fdb59df8e1243ebc5e96317926de197e4"} Apr 16 08:33:28.275738 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:28.275710 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-fpxkh" event={"ID":"da769e7e-5234-4962-b0d5-107292fb0b9f","Type":"ContainerStarted","Data":"e95dded550dffb6d8fe317c487fa37eb87818c978407e9489921e602d2dae699"} Apr 16 08:33:28.320387 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:28.320346 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-fpxkh" podStartSLOduration=4.406743741 podStartE2EDuration="21.320334646s" podCreationTimestamp="2026-04-16 08:33:07 +0000 UTC" firstStartedPulling="2026-04-16 08:33:09.828419973 +0000 UTC m=+3.235458915" lastFinishedPulling="2026-04-16 08:33:26.74201088 +0000 UTC m=+20.149049820" observedRunningTime="2026-04-16 08:33:28.320265859 +0000 UTC m=+21.727304816" watchObservedRunningTime="2026-04-16 08:33:28.320334646 +0000 UTC m=+21.727373608" Apr 16 08:33:28.535150 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:28.535127 2579 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 16 08:33:29.168059 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:29.167954 2579 reconciler.go:161] "OperationExecutor.RegisterPlugin started" 
plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T08:33:28.535144858Z","UUID":"f91567bf-eb68-478d-a2c8-cd3defba0a12","Handler":null,"Name":"","Endpoint":""} Apr 16 08:33:29.169781 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:29.169760 2579 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 16 08:33:29.169908 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:29.169791 2579 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 16 08:33:29.188468 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:29.188438 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vdxz4" Apr 16 08:33:29.188599 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:29.188438 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ktjq7" Apr 16 08:33:29.188599 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:33:29.188553 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vdxz4" podUID="53e35ca8-ec77-48b7-8e96-ae73f7083c85" Apr 16 08:33:29.188720 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:29.188445 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ntc92" Apr 16 08:33:29.188720 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:33:29.188634 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-ktjq7" podUID="fca15501-2323-4079-b35a-57d66f40d0b1" Apr 16 08:33:29.188806 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:33:29.188772 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ntc92" podUID="3b6d17c9-2d51-47bc-9e36-95fb034872cb" Apr 16 08:33:29.279909 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:29.279851 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4mslc" event={"ID":"0508d934-e625-43c2-940e-b3bcd7809d6d","Type":"ContainerStarted","Data":"2cb4a40ecd9c6e6e11d8527556ff3cbe62bbb3107c7d0a1df201ea73ccd17e06"} Apr 16 08:33:29.403670 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:29.403623 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-j7rrw" Apr 16 08:33:30.284538 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:30.284286 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4mslc" event={"ID":"0508d934-e625-43c2-940e-b3bcd7809d6d","Type":"ContainerStarted","Data":"3bf50478f6fdb24e0d0ac7d2f91059a1b77b660a2371336354bfc12887562d18"} Apr 16 08:33:30.287481 
ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:30.287463 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-m577d_09a74c48-2de1-498a-893b-0fa1b8dbd0dd/ovn-acl-logging/0.log" Apr 16 08:33:30.287820 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:30.287799 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m577d" event={"ID":"09a74c48-2de1-498a-893b-0fa1b8dbd0dd","Type":"ContainerStarted","Data":"4425e1dc47b72396ff4d6da99c11a29c91f798c8932a8ec9d3f25ead477ca2f9"} Apr 16 08:33:30.303204 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:30.303156 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4mslc" podStartSLOduration=3.224296548 podStartE2EDuration="23.303139242s" podCreationTimestamp="2026-04-16 08:33:07 +0000 UTC" firstStartedPulling="2026-04-16 08:33:09.836757058 +0000 UTC m=+3.243795998" lastFinishedPulling="2026-04-16 08:33:29.915599738 +0000 UTC m=+23.322638692" observedRunningTime="2026-04-16 08:33:30.302229388 +0000 UTC m=+23.709268350" watchObservedRunningTime="2026-04-16 08:33:30.303139242 +0000 UTC m=+23.710178206" Apr 16 08:33:30.689215 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:30.689178 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-j7rrw" Apr 16 08:33:30.689843 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:30.689817 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-j7rrw" Apr 16 08:33:31.184700 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:31.184668 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vdxz4" Apr 16 08:33:31.184861 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:31.184667 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ntc92" Apr 16 08:33:31.184861 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:33:31.184812 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vdxz4" podUID="53e35ca8-ec77-48b7-8e96-ae73f7083c85" Apr 16 08:33:31.185031 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:33:31.184861 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ntc92" podUID="3b6d17c9-2d51-47bc-9e36-95fb034872cb" Apr 16 08:33:31.189268 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:31.189246 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ktjq7" Apr 16 08:33:31.189373 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:33:31.189344 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-ktjq7" podUID="fca15501-2323-4079-b35a-57d66f40d0b1" Apr 16 08:33:31.290762 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:31.290736 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-j7rrw" Apr 16 08:33:33.185096 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:33.184839 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ntc92" Apr 16 08:33:33.185704 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:33.184845 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vdxz4" Apr 16 08:33:33.185704 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:33.184859 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ktjq7" Apr 16 08:33:33.185704 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:33:33.185239 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vdxz4" podUID="53e35ca8-ec77-48b7-8e96-ae73f7083c85" Apr 16 08:33:33.185704 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:33:33.185114 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-ntc92" podUID="3b6d17c9-2d51-47bc-9e36-95fb034872cb" Apr 16 08:33:33.185704 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:33:33.185291 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-ktjq7" podUID="fca15501-2323-4079-b35a-57d66f40d0b1" Apr 16 08:33:33.296256 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:33.296231 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-m577d_09a74c48-2de1-498a-893b-0fa1b8dbd0dd/ovn-acl-logging/0.log" Apr 16 08:33:33.296555 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:33.296533 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m577d" event={"ID":"09a74c48-2de1-498a-893b-0fa1b8dbd0dd","Type":"ContainerStarted","Data":"4b9a9dacd6b44fca2380607044ff8b4fe6be8f69a0f37b792e010f9cbdc0c140"} Apr 16 08:33:33.296807 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:33.296786 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-m577d" Apr 16 08:33:33.296932 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:33.296812 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-m577d" Apr 16 08:33:33.296932 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:33.296825 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-m577d" Apr 16 08:33:33.297073 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:33.297057 2579 scope.go:117] "RemoveContainer" containerID="49e557d613f81d02ce2e01056daa04394b4bde789f34fc784bcffc73f8edb81b" Apr 
16 08:33:33.298563 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:33.298535 2579 generic.go:358] "Generic (PLEG): container finished" podID="7330bfa4-019f-4fdd-bc61-5919a528f3e1" containerID="468ee5f42ba1bbb39d98eb1087416c7d003ba7e171930a7b561f7496f6af6e99" exitCode=0 Apr 16 08:33:33.298655 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:33.298581 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6cdbx" event={"ID":"7330bfa4-019f-4fdd-bc61-5919a528f3e1","Type":"ContainerDied","Data":"468ee5f42ba1bbb39d98eb1087416c7d003ba7e171930a7b561f7496f6af6e99"} Apr 16 08:33:33.312568 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:33.312550 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-m577d" Apr 16 08:33:33.312678 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:33.312657 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-m577d" Apr 16 08:33:34.305272 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:34.305239 2579 generic.go:358] "Generic (PLEG): container finished" podID="7330bfa4-019f-4fdd-bc61-5919a528f3e1" containerID="38afa9c6cea4e9a08c775cd626b816d4209d0ef0d32719d81ba6e0a73176672a" exitCode=0 Apr 16 08:33:34.305670 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:34.305308 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6cdbx" event={"ID":"7330bfa4-019f-4fdd-bc61-5919a528f3e1","Type":"ContainerDied","Data":"38afa9c6cea4e9a08c775cd626b816d4209d0ef0d32719d81ba6e0a73176672a"} Apr 16 08:33:34.308754 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:34.308734 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-m577d_09a74c48-2de1-498a-893b-0fa1b8dbd0dd/ovn-acl-logging/0.log" Apr 16 08:33:34.309084 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:34.309066 2579 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m577d" event={"ID":"09a74c48-2de1-498a-893b-0fa1b8dbd0dd","Type":"ContainerStarted","Data":"7390f5f5b79472822deb720c3c78a414bbb61380c4f63674866be33128e2b687"} Apr 16 08:33:34.356315 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:34.356230 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-m577d" podStartSLOduration=10.298101063 podStartE2EDuration="27.356217407s" podCreationTimestamp="2026-04-16 08:33:07 +0000 UTC" firstStartedPulling="2026-04-16 08:33:09.829106967 +0000 UTC m=+3.236145908" lastFinishedPulling="2026-04-16 08:33:26.88722331 +0000 UTC m=+20.294262252" observedRunningTime="2026-04-16 08:33:34.354553758 +0000 UTC m=+27.761592894" watchObservedRunningTime="2026-04-16 08:33:34.356217407 +0000 UTC m=+27.763256398" Apr 16 08:33:34.680674 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:34.680605 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-ktjq7"] Apr 16 08:33:34.680804 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:34.680730 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ktjq7" Apr 16 08:33:34.680840 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:33:34.680815 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-ktjq7" podUID="fca15501-2323-4079-b35a-57d66f40d0b1" Apr 16 08:33:34.683853 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:34.683829 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-vdxz4"] Apr 16 08:33:34.683977 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:34.683942 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vdxz4" Apr 16 08:33:34.684031 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:33:34.684015 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vdxz4" podUID="53e35ca8-ec77-48b7-8e96-ae73f7083c85" Apr 16 08:33:34.693291 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:34.693271 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-ntc92"] Apr 16 08:33:34.693376 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:34.693363 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ntc92" Apr 16 08:33:34.693448 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:33:34.693434 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-ntc92" podUID="3b6d17c9-2d51-47bc-9e36-95fb034872cb" Apr 16 08:33:35.313562 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:35.313530 2579 generic.go:358] "Generic (PLEG): container finished" podID="7330bfa4-019f-4fdd-bc61-5919a528f3e1" containerID="b6713ed6c1f3d0ade1cd52e47876ce306a4791e5486f3e4ae163e3824902210e" exitCode=0 Apr 16 08:33:35.314034 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:35.313625 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6cdbx" event={"ID":"7330bfa4-019f-4fdd-bc61-5919a528f3e1","Type":"ContainerDied","Data":"b6713ed6c1f3d0ade1cd52e47876ce306a4791e5486f3e4ae163e3824902210e"} Apr 16 08:33:36.184714 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:36.184638 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ntc92" Apr 16 08:33:36.184907 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:36.184638 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ktjq7" Apr 16 08:33:36.184907 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:33:36.184764 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ntc92" podUID="3b6d17c9-2d51-47bc-9e36-95fb034872cb" Apr 16 08:33:36.184907 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:36.184638 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vdxz4" Apr 16 08:33:36.184907 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:33:36.184836 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-ktjq7" podUID="fca15501-2323-4079-b35a-57d66f40d0b1" Apr 16 08:33:36.185096 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:33:36.184922 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vdxz4" podUID="53e35ca8-ec77-48b7-8e96-ae73f7083c85" Apr 16 08:33:38.185384 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:38.185351 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ntc92" Apr 16 08:33:38.185384 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:38.185374 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ktjq7" Apr 16 08:33:38.185934 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:38.185399 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vdxz4" Apr 16 08:33:38.185934 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:33:38.185498 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ntc92" podUID="3b6d17c9-2d51-47bc-9e36-95fb034872cb" Apr 16 08:33:38.185934 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:33:38.185646 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-ktjq7" podUID="fca15501-2323-4079-b35a-57d66f40d0b1" Apr 16 08:33:38.185934 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:33:38.185765 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vdxz4" podUID="53e35ca8-ec77-48b7-8e96-ae73f7083c85" Apr 16 08:33:39.930721 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:39.930647 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-41.ec2.internal" event="NodeReady" Apr 16 08:33:39.931248 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:39.930768 2579 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 16 08:33:39.983342 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:39.983310 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-flzpm"] Apr 16 08:33:39.987464 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:39.987437 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-b9rrh"] Apr 16 08:33:39.987637 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:39.987611 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-flzpm" Apr 16 08:33:39.990160 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:39.989875 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 16 08:33:39.990160 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:39.989905 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-l2kgt\"" Apr 16 08:33:39.990348 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:39.990201 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 16 08:33:39.990484 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:39.990457 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-b9rrh" Apr 16 08:33:39.992556 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:39.992536 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 16 08:33:39.992658 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:39.992613 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 16 08:33:39.993074 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:39.993056 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 16 08:33:39.993168 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:39.993151 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-kp7vt\"" Apr 16 08:33:39.994471 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:39.994449 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-b9rrh"] Apr 16 08:33:39.997993 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:39.997972 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-flzpm"] Apr 16 08:33:40.065096 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:40.065061 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7025f860-4946-43aa-9ebe-7d45f5616858-tmp-dir\") pod \"dns-default-flzpm\" (UID: \"7025f860-4946-43aa-9ebe-7d45f5616858\") " pod="openshift-dns/dns-default-flzpm" Apr 16 08:33:40.065286 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:40.065122 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6tbqd\" (UniqueName: 
\"kubernetes.io/projected/7025f860-4946-43aa-9ebe-7d45f5616858-kube-api-access-6tbqd\") pod \"dns-default-flzpm\" (UID: \"7025f860-4946-43aa-9ebe-7d45f5616858\") " pod="openshift-dns/dns-default-flzpm" Apr 16 08:33:40.065286 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:40.065191 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7025f860-4946-43aa-9ebe-7d45f5616858-metrics-tls\") pod \"dns-default-flzpm\" (UID: \"7025f860-4946-43aa-9ebe-7d45f5616858\") " pod="openshift-dns/dns-default-flzpm" Apr 16 08:33:40.065286 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:40.065244 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7025f860-4946-43aa-9ebe-7d45f5616858-config-volume\") pod \"dns-default-flzpm\" (UID: \"7025f860-4946-43aa-9ebe-7d45f5616858\") " pod="openshift-dns/dns-default-flzpm" Apr 16 08:33:40.166054 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:40.166012 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6tbqd\" (UniqueName: \"kubernetes.io/projected/7025f860-4946-43aa-9ebe-7d45f5616858-kube-api-access-6tbqd\") pod \"dns-default-flzpm\" (UID: \"7025f860-4946-43aa-9ebe-7d45f5616858\") " pod="openshift-dns/dns-default-flzpm" Apr 16 08:33:40.166054 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:40.166059 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/06b0854a-ae72-4622-b5bd-6803c4a6d119-cert\") pod \"ingress-canary-b9rrh\" (UID: \"06b0854a-ae72-4622-b5bd-6803c4a6d119\") " pod="openshift-ingress-canary/ingress-canary-b9rrh" Apr 16 08:33:40.166277 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:40.166100 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" 
(UniqueName: \"kubernetes.io/secret/7025f860-4946-43aa-9ebe-7d45f5616858-metrics-tls\") pod \"dns-default-flzpm\" (UID: \"7025f860-4946-43aa-9ebe-7d45f5616858\") " pod="openshift-dns/dns-default-flzpm" Apr 16 08:33:40.166277 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:40.166123 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vz6t8\" (UniqueName: \"kubernetes.io/projected/06b0854a-ae72-4622-b5bd-6803c4a6d119-kube-api-access-vz6t8\") pod \"ingress-canary-b9rrh\" (UID: \"06b0854a-ae72-4622-b5bd-6803c4a6d119\") " pod="openshift-ingress-canary/ingress-canary-b9rrh" Apr 16 08:33:40.166277 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:33:40.166226 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 08:33:40.166277 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:40.166251 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7025f860-4946-43aa-9ebe-7d45f5616858-config-volume\") pod \"dns-default-flzpm\" (UID: \"7025f860-4946-43aa-9ebe-7d45f5616858\") " pod="openshift-dns/dns-default-flzpm" Apr 16 08:33:40.166472 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:33:40.166293 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7025f860-4946-43aa-9ebe-7d45f5616858-metrics-tls podName:7025f860-4946-43aa-9ebe-7d45f5616858 nodeName:}" failed. No retries permitted until 2026-04-16 08:33:40.66627132 +0000 UTC m=+34.073310262 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7025f860-4946-43aa-9ebe-7d45f5616858-metrics-tls") pod "dns-default-flzpm" (UID: "7025f860-4946-43aa-9ebe-7d45f5616858") : secret "dns-default-metrics-tls" not found Apr 16 08:33:40.166472 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:40.166350 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7025f860-4946-43aa-9ebe-7d45f5616858-tmp-dir\") pod \"dns-default-flzpm\" (UID: \"7025f860-4946-43aa-9ebe-7d45f5616858\") " pod="openshift-dns/dns-default-flzpm" Apr 16 08:33:40.166661 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:40.166642 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7025f860-4946-43aa-9ebe-7d45f5616858-tmp-dir\") pod \"dns-default-flzpm\" (UID: \"7025f860-4946-43aa-9ebe-7d45f5616858\") " pod="openshift-dns/dns-default-flzpm" Apr 16 08:33:40.166722 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:40.166677 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7025f860-4946-43aa-9ebe-7d45f5616858-config-volume\") pod \"dns-default-flzpm\" (UID: \"7025f860-4946-43aa-9ebe-7d45f5616858\") " pod="openshift-dns/dns-default-flzpm" Apr 16 08:33:40.177753 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:40.177720 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6tbqd\" (UniqueName: \"kubernetes.io/projected/7025f860-4946-43aa-9ebe-7d45f5616858-kube-api-access-6tbqd\") pod \"dns-default-flzpm\" (UID: \"7025f860-4946-43aa-9ebe-7d45f5616858\") " pod="openshift-dns/dns-default-flzpm" Apr 16 08:33:40.184518 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:40.184467 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ntc92" Apr 16 08:33:40.184623 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:40.184467 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ktjq7" Apr 16 08:33:40.184745 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:40.184471 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vdxz4" Apr 16 08:33:40.187545 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:40.187525 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 08:33:40.187664 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:40.187636 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-kd4nk\"" Apr 16 08:33:40.187730 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:40.187688 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 08:33:40.187780 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:40.187730 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-blgdz\"" Apr 16 08:33:40.188077 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:40.188058 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 08:33:40.188164 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:40.188078 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 16 08:33:40.266972 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:40.266933 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vz6t8\" (UniqueName: 
\"kubernetes.io/projected/06b0854a-ae72-4622-b5bd-6803c4a6d119-kube-api-access-vz6t8\") pod \"ingress-canary-b9rrh\" (UID: \"06b0854a-ae72-4622-b5bd-6803c4a6d119\") " pod="openshift-ingress-canary/ingress-canary-b9rrh" Apr 16 08:33:40.267122 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:40.267053 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/06b0854a-ae72-4622-b5bd-6803c4a6d119-cert\") pod \"ingress-canary-b9rrh\" (UID: \"06b0854a-ae72-4622-b5bd-6803c4a6d119\") " pod="openshift-ingress-canary/ingress-canary-b9rrh" Apr 16 08:33:40.267179 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:33:40.267168 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 08:33:40.267243 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:33:40.267231 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/06b0854a-ae72-4622-b5bd-6803c4a6d119-cert podName:06b0854a-ae72-4622-b5bd-6803c4a6d119 nodeName:}" failed. No retries permitted until 2026-04-16 08:33:40.767211586 +0000 UTC m=+34.174250531 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/06b0854a-ae72-4622-b5bd-6803c4a6d119-cert") pod "ingress-canary-b9rrh" (UID: "06b0854a-ae72-4622-b5bd-6803c4a6d119") : secret "canary-serving-cert" not found Apr 16 08:33:40.276735 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:40.276707 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vz6t8\" (UniqueName: \"kubernetes.io/projected/06b0854a-ae72-4622-b5bd-6803c4a6d119-kube-api-access-vz6t8\") pod \"ingress-canary-b9rrh\" (UID: \"06b0854a-ae72-4622-b5bd-6803c4a6d119\") " pod="openshift-ingress-canary/ingress-canary-b9rrh" Apr 16 08:33:40.671189 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:40.671152 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7025f860-4946-43aa-9ebe-7d45f5616858-metrics-tls\") pod \"dns-default-flzpm\" (UID: \"7025f860-4946-43aa-9ebe-7d45f5616858\") " pod="openshift-dns/dns-default-flzpm" Apr 16 08:33:40.671408 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:33:40.671280 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 08:33:40.671408 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:33:40.671339 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7025f860-4946-43aa-9ebe-7d45f5616858-metrics-tls podName:7025f860-4946-43aa-9ebe-7d45f5616858 nodeName:}" failed. No retries permitted until 2026-04-16 08:33:41.671319362 +0000 UTC m=+35.078358304 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7025f860-4946-43aa-9ebe-7d45f5616858-metrics-tls") pod "dns-default-flzpm" (UID: "7025f860-4946-43aa-9ebe-7d45f5616858") : secret "dns-default-metrics-tls" not found Apr 16 08:33:40.772301 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:40.772268 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/06b0854a-ae72-4622-b5bd-6803c4a6d119-cert\") pod \"ingress-canary-b9rrh\" (UID: \"06b0854a-ae72-4622-b5bd-6803c4a6d119\") " pod="openshift-ingress-canary/ingress-canary-b9rrh" Apr 16 08:33:40.772481 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:33:40.772434 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 08:33:40.772546 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:33:40.772513 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/06b0854a-ae72-4622-b5bd-6803c4a6d119-cert podName:06b0854a-ae72-4622-b5bd-6803c4a6d119 nodeName:}" failed. No retries permitted until 2026-04-16 08:33:41.772491152 +0000 UTC m=+35.179530093 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/06b0854a-ae72-4622-b5bd-6803c4a6d119-cert") pod "ingress-canary-b9rrh" (UID: "06b0854a-ae72-4622-b5bd-6803c4a6d119") : secret "canary-serving-cert" not found Apr 16 08:33:40.973385 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:40.973317 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/53e35ca8-ec77-48b7-8e96-ae73f7083c85-metrics-certs\") pod \"network-metrics-daemon-vdxz4\" (UID: \"53e35ca8-ec77-48b7-8e96-ae73f7083c85\") " pod="openshift-multus/network-metrics-daemon-vdxz4" Apr 16 08:33:40.973385 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:40.973355 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7886z\" (UniqueName: \"kubernetes.io/projected/3b6d17c9-2d51-47bc-9e36-95fb034872cb-kube-api-access-7886z\") pod \"network-check-target-ntc92\" (UID: \"3b6d17c9-2d51-47bc-9e36-95fb034872cb\") " pod="openshift-network-diagnostics/network-check-target-ntc92" Apr 16 08:33:40.973955 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:33:40.973477 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 08:33:40.973955 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:33:40.973544 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/53e35ca8-ec77-48b7-8e96-ae73f7083c85-metrics-certs podName:53e35ca8-ec77-48b7-8e96-ae73f7083c85 nodeName:}" failed. No retries permitted until 2026-04-16 08:34:12.973524588 +0000 UTC m=+66.380563534 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/53e35ca8-ec77-48b7-8e96-ae73f7083c85-metrics-certs") pod "network-metrics-daemon-vdxz4" (UID: "53e35ca8-ec77-48b7-8e96-ae73f7083c85") : secret "metrics-daemon-secret" not found Apr 16 08:33:40.976101 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:40.976084 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7886z\" (UniqueName: \"kubernetes.io/projected/3b6d17c9-2d51-47bc-9e36-95fb034872cb-kube-api-access-7886z\") pod \"network-check-target-ntc92\" (UID: \"3b6d17c9-2d51-47bc-9e36-95fb034872cb\") " pod="openshift-network-diagnostics/network-check-target-ntc92" Apr 16 08:33:41.095557 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:41.095524 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ntc92" Apr 16 08:33:41.421954 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:41.421806 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-ntc92"] Apr 16 08:33:41.424868 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:41.424838 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6d17c9_2d51_47bc_9e36_95fb034872cb.slice/crio-df855e1968cc210612ef727be1dc6c0450fa286a0803b9981c9415ec42545232 WatchSource:0}: Error finding container df855e1968cc210612ef727be1dc6c0450fa286a0803b9981c9415ec42545232: Status 404 returned error can't find the container with id df855e1968cc210612ef727be1dc6c0450fa286a0803b9981c9415ec42545232 Apr 16 08:33:41.678571 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:41.678537 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7025f860-4946-43aa-9ebe-7d45f5616858-metrics-tls\") pod \"dns-default-flzpm\" (UID: 
\"7025f860-4946-43aa-9ebe-7d45f5616858\") " pod="openshift-dns/dns-default-flzpm" Apr 16 08:33:41.678693 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:33:41.678673 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 08:33:41.678740 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:33:41.678732 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7025f860-4946-43aa-9ebe-7d45f5616858-metrics-tls podName:7025f860-4946-43aa-9ebe-7d45f5616858 nodeName:}" failed. No retries permitted until 2026-04-16 08:33:43.678717428 +0000 UTC m=+37.085756369 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7025f860-4946-43aa-9ebe-7d45f5616858-metrics-tls") pod "dns-default-flzpm" (UID: "7025f860-4946-43aa-9ebe-7d45f5616858") : secret "dns-default-metrics-tls" not found Apr 16 08:33:41.779648 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:41.779614 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/06b0854a-ae72-4622-b5bd-6803c4a6d119-cert\") pod \"ingress-canary-b9rrh\" (UID: \"06b0854a-ae72-4622-b5bd-6803c4a6d119\") " pod="openshift-ingress-canary/ingress-canary-b9rrh" Apr 16 08:33:41.779778 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:33:41.779754 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 08:33:41.779878 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:33:41.779810 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/06b0854a-ae72-4622-b5bd-6803c4a6d119-cert podName:06b0854a-ae72-4622-b5bd-6803c4a6d119 nodeName:}" failed. No retries permitted until 2026-04-16 08:33:43.779796407 +0000 UTC m=+37.186835348 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/06b0854a-ae72-4622-b5bd-6803c4a6d119-cert") pod "ingress-canary-b9rrh" (UID: "06b0854a-ae72-4622-b5bd-6803c4a6d119") : secret "canary-serving-cert" not found Apr 16 08:33:42.342977 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:42.342941 2579 generic.go:358] "Generic (PLEG): container finished" podID="7330bfa4-019f-4fdd-bc61-5919a528f3e1" containerID="028d78b4672d922984ce33a9f8d808b90898ee61eafc10d71812f65dcd2bc557" exitCode=0 Apr 16 08:33:42.343526 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:42.343019 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6cdbx" event={"ID":"7330bfa4-019f-4fdd-bc61-5919a528f3e1","Type":"ContainerDied","Data":"028d78b4672d922984ce33a9f8d808b90898ee61eafc10d71812f65dcd2bc557"} Apr 16 08:33:42.344661 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:42.344636 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-ntc92" event={"ID":"3b6d17c9-2d51-47bc-9e36-95fb034872cb","Type":"ContainerStarted","Data":"df855e1968cc210612ef727be1dc6c0450fa286a0803b9981c9415ec42545232"} Apr 16 08:33:43.350063 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:43.350031 2579 generic.go:358] "Generic (PLEG): container finished" podID="7330bfa4-019f-4fdd-bc61-5919a528f3e1" containerID="7d4519faeeb379c5d06af15dd6c7e25ab8c84380b4f9c9e8b490aed38163b342" exitCode=0 Apr 16 08:33:43.350545 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:43.350088 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6cdbx" event={"ID":"7330bfa4-019f-4fdd-bc61-5919a528f3e1","Type":"ContainerDied","Data":"7d4519faeeb379c5d06af15dd6c7e25ab8c84380b4f9c9e8b490aed38163b342"} Apr 16 08:33:43.494251 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:43.494209 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/fca15501-2323-4079-b35a-57d66f40d0b1-original-pull-secret\") pod \"global-pull-secret-syncer-ktjq7\" (UID: \"fca15501-2323-4079-b35a-57d66f40d0b1\") " pod="kube-system/global-pull-secret-syncer-ktjq7" Apr 16 08:33:43.498604 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:43.498578 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/fca15501-2323-4079-b35a-57d66f40d0b1-original-pull-secret\") pod \"global-pull-secret-syncer-ktjq7\" (UID: \"fca15501-2323-4079-b35a-57d66f40d0b1\") " pod="kube-system/global-pull-secret-syncer-ktjq7" Apr 16 08:33:43.502379 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:43.502355 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ktjq7" Apr 16 08:33:43.695676 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:43.695592 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7025f860-4946-43aa-9ebe-7d45f5616858-metrics-tls\") pod \"dns-default-flzpm\" (UID: \"7025f860-4946-43aa-9ebe-7d45f5616858\") " pod="openshift-dns/dns-default-flzpm" Apr 16 08:33:43.695826 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:33:43.695716 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 08:33:43.695826 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:33:43.695783 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7025f860-4946-43aa-9ebe-7d45f5616858-metrics-tls podName:7025f860-4946-43aa-9ebe-7d45f5616858 nodeName:}" failed. No retries permitted until 2026-04-16 08:33:47.695763622 +0000 UTC m=+41.102802586 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7025f860-4946-43aa-9ebe-7d45f5616858-metrics-tls") pod "dns-default-flzpm" (UID: "7025f860-4946-43aa-9ebe-7d45f5616858") : secret "dns-default-metrics-tls" not found Apr 16 08:33:43.795969 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:43.795934 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/06b0854a-ae72-4622-b5bd-6803c4a6d119-cert\") pod \"ingress-canary-b9rrh\" (UID: \"06b0854a-ae72-4622-b5bd-6803c4a6d119\") " pod="openshift-ingress-canary/ingress-canary-b9rrh" Apr 16 08:33:43.796124 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:33:43.796077 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 08:33:43.796182 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:33:43.796137 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/06b0854a-ae72-4622-b5bd-6803c4a6d119-cert podName:06b0854a-ae72-4622-b5bd-6803c4a6d119 nodeName:}" failed. No retries permitted until 2026-04-16 08:33:47.796123968 +0000 UTC m=+41.203162910 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/06b0854a-ae72-4622-b5bd-6803c4a6d119-cert") pod "ingress-canary-b9rrh" (UID: "06b0854a-ae72-4622-b5bd-6803c4a6d119") : secret "canary-serving-cert" not found Apr 16 08:33:44.266930 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:44.266771 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-ktjq7"] Apr 16 08:33:44.269993 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:33:44.269955 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfca15501_2323_4079_b35a_57d66f40d0b1.slice/crio-89d801f426dd1e2bac75c1a5919967ccbd25be98606df0beb5ef85f4e26c454c WatchSource:0}: Error finding container 89d801f426dd1e2bac75c1a5919967ccbd25be98606df0beb5ef85f4e26c454c: Status 404 returned error can't find the container with id 89d801f426dd1e2bac75c1a5919967ccbd25be98606df0beb5ef85f4e26c454c Apr 16 08:33:44.353268 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:44.353242 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-ktjq7" event={"ID":"fca15501-2323-4079-b35a-57d66f40d0b1","Type":"ContainerStarted","Data":"89d801f426dd1e2bac75c1a5919967ccbd25be98606df0beb5ef85f4e26c454c"} Apr 16 08:33:44.355987 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:44.355968 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6cdbx" event={"ID":"7330bfa4-019f-4fdd-bc61-5919a528f3e1","Type":"ContainerStarted","Data":"f26a97605f1fd17ae6ec9e03e1325545d6c66ffe9e107863295d660820c05c84"} Apr 16 08:33:44.383305 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:44.383258 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-6cdbx" podStartSLOduration=5.972387124 podStartE2EDuration="37.383246985s" podCreationTimestamp="2026-04-16 08:33:07 
+0000 UTC" firstStartedPulling="2026-04-16 08:33:09.835576283 +0000 UTC m=+3.242615239" lastFinishedPulling="2026-04-16 08:33:41.246436147 +0000 UTC m=+34.653475100" observedRunningTime="2026-04-16 08:33:44.381591433 +0000 UTC m=+37.788630390" watchObservedRunningTime="2026-04-16 08:33:44.383246985 +0000 UTC m=+37.790285948" Apr 16 08:33:45.359676 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:45.359599 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-ntc92" event={"ID":"3b6d17c9-2d51-47bc-9e36-95fb034872cb","Type":"ContainerStarted","Data":"33a9c5018ee99dda206a5e6bb00e825e5be50ae4430e14727a65abc66c7c81b2"} Apr 16 08:33:45.359676 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:45.359647 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-ntc92" Apr 16 08:33:45.378003 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:45.377964 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-ntc92" podStartSLOduration=35.322556598 podStartE2EDuration="38.377951484s" podCreationTimestamp="2026-04-16 08:33:07 +0000 UTC" firstStartedPulling="2026-04-16 08:33:41.42693899 +0000 UTC m=+34.833977931" lastFinishedPulling="2026-04-16 08:33:44.482333863 +0000 UTC m=+37.889372817" observedRunningTime="2026-04-16 08:33:45.376352059 +0000 UTC m=+38.783391022" watchObservedRunningTime="2026-04-16 08:33:45.377951484 +0000 UTC m=+38.784990480" Apr 16 08:33:47.725469 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:47.725435 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7025f860-4946-43aa-9ebe-7d45f5616858-metrics-tls\") pod \"dns-default-flzpm\" (UID: \"7025f860-4946-43aa-9ebe-7d45f5616858\") " pod="openshift-dns/dns-default-flzpm" Apr 16 08:33:47.725874 ip-10-0-130-41 kubenswrapper[2579]: E0416 
08:33:47.725609 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 08:33:47.725874 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:33:47.725693 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7025f860-4946-43aa-9ebe-7d45f5616858-metrics-tls podName:7025f860-4946-43aa-9ebe-7d45f5616858 nodeName:}" failed. No retries permitted until 2026-04-16 08:33:55.725670348 +0000 UTC m=+49.132709288 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7025f860-4946-43aa-9ebe-7d45f5616858-metrics-tls") pod "dns-default-flzpm" (UID: "7025f860-4946-43aa-9ebe-7d45f5616858") : secret "dns-default-metrics-tls" not found Apr 16 08:33:47.826167 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:47.826131 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/06b0854a-ae72-4622-b5bd-6803c4a6d119-cert\") pod \"ingress-canary-b9rrh\" (UID: \"06b0854a-ae72-4622-b5bd-6803c4a6d119\") " pod="openshift-ingress-canary/ingress-canary-b9rrh" Apr 16 08:33:47.826348 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:33:47.826260 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 08:33:47.826348 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:33:47.826326 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/06b0854a-ae72-4622-b5bd-6803c4a6d119-cert podName:06b0854a-ae72-4622-b5bd-6803c4a6d119 nodeName:}" failed. No retries permitted until 2026-04-16 08:33:55.826306839 +0000 UTC m=+49.233345803 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/06b0854a-ae72-4622-b5bd-6803c4a6d119-cert") pod "ingress-canary-b9rrh" (UID: "06b0854a-ae72-4622-b5bd-6803c4a6d119") : secret "canary-serving-cert" not found Apr 16 08:33:49.367339 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:49.367297 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-ktjq7" event={"ID":"fca15501-2323-4079-b35a-57d66f40d0b1","Type":"ContainerStarted","Data":"69afe98fdeb1170d25b49843331259ff0b7d0af979a2fd61779538da834d4adc"} Apr 16 08:33:49.384414 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:49.384359 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-ktjq7" podStartSLOduration=34.375645938 podStartE2EDuration="38.384346465s" podCreationTimestamp="2026-04-16 08:33:11 +0000 UTC" firstStartedPulling="2026-04-16 08:33:44.272457073 +0000 UTC m=+37.679496015" lastFinishedPulling="2026-04-16 08:33:48.281157587 +0000 UTC m=+41.688196542" observedRunningTime="2026-04-16 08:33:49.38414656 +0000 UTC m=+42.791185526" watchObservedRunningTime="2026-04-16 08:33:49.384346465 +0000 UTC m=+42.791385428" Apr 16 08:33:55.776874 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:55.776830 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7025f860-4946-43aa-9ebe-7d45f5616858-metrics-tls\") pod \"dns-default-flzpm\" (UID: \"7025f860-4946-43aa-9ebe-7d45f5616858\") " pod="openshift-dns/dns-default-flzpm" Apr 16 08:33:55.777311 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:33:55.777012 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 08:33:55.777311 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:33:55.777078 2579 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/7025f860-4946-43aa-9ebe-7d45f5616858-metrics-tls podName:7025f860-4946-43aa-9ebe-7d45f5616858 nodeName:}" failed. No retries permitted until 2026-04-16 08:34:11.777059054 +0000 UTC m=+65.184097995 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7025f860-4946-43aa-9ebe-7d45f5616858-metrics-tls") pod "dns-default-flzpm" (UID: "7025f860-4946-43aa-9ebe-7d45f5616858") : secret "dns-default-metrics-tls" not found Apr 16 08:33:55.877421 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:33:55.877389 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/06b0854a-ae72-4622-b5bd-6803c4a6d119-cert\") pod \"ingress-canary-b9rrh\" (UID: \"06b0854a-ae72-4622-b5bd-6803c4a6d119\") " pod="openshift-ingress-canary/ingress-canary-b9rrh" Apr 16 08:33:55.877555 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:33:55.877533 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 08:33:55.877606 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:33:55.877596 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/06b0854a-ae72-4622-b5bd-6803c4a6d119-cert podName:06b0854a-ae72-4622-b5bd-6803c4a6d119 nodeName:}" failed. No retries permitted until 2026-04-16 08:34:11.87757646 +0000 UTC m=+65.284615403 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/06b0854a-ae72-4622-b5bd-6803c4a6d119-cert") pod "ingress-canary-b9rrh" (UID: "06b0854a-ae72-4622-b5bd-6803c4a6d119") : secret "canary-serving-cert" not found Apr 16 08:34:05.325317 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:34:05.325284 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-m577d" Apr 16 08:34:11.782304 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:34:11.782267 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7025f860-4946-43aa-9ebe-7d45f5616858-metrics-tls\") pod \"dns-default-flzpm\" (UID: \"7025f860-4946-43aa-9ebe-7d45f5616858\") " pod="openshift-dns/dns-default-flzpm" Apr 16 08:34:11.782820 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:34:11.782434 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 08:34:11.782820 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:34:11.782512 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7025f860-4946-43aa-9ebe-7d45f5616858-metrics-tls podName:7025f860-4946-43aa-9ebe-7d45f5616858 nodeName:}" failed. No retries permitted until 2026-04-16 08:34:43.782495434 +0000 UTC m=+97.189534374 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7025f860-4946-43aa-9ebe-7d45f5616858-metrics-tls") pod "dns-default-flzpm" (UID: "7025f860-4946-43aa-9ebe-7d45f5616858") : secret "dns-default-metrics-tls" not found Apr 16 08:34:11.887030 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:34:11.886992 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/06b0854a-ae72-4622-b5bd-6803c4a6d119-cert\") pod \"ingress-canary-b9rrh\" (UID: \"06b0854a-ae72-4622-b5bd-6803c4a6d119\") " pod="openshift-ingress-canary/ingress-canary-b9rrh" Apr 16 08:34:11.887166 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:34:11.887132 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 08:34:11.887206 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:34:11.887193 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/06b0854a-ae72-4622-b5bd-6803c4a6d119-cert podName:06b0854a-ae72-4622-b5bd-6803c4a6d119 nodeName:}" failed. No retries permitted until 2026-04-16 08:34:43.887177582 +0000 UTC m=+97.294216523 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/06b0854a-ae72-4622-b5bd-6803c4a6d119-cert") pod "ingress-canary-b9rrh" (UID: "06b0854a-ae72-4622-b5bd-6803c4a6d119") : secret "canary-serving-cert" not found Apr 16 08:34:12.995020 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:34:12.994964 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/53e35ca8-ec77-48b7-8e96-ae73f7083c85-metrics-certs\") pod \"network-metrics-daemon-vdxz4\" (UID: \"53e35ca8-ec77-48b7-8e96-ae73f7083c85\") " pod="openshift-multus/network-metrics-daemon-vdxz4" Apr 16 08:34:12.995408 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:34:12.995120 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 08:34:12.995408 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:34:12.995187 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/53e35ca8-ec77-48b7-8e96-ae73f7083c85-metrics-certs podName:53e35ca8-ec77-48b7-8e96-ae73f7083c85 nodeName:}" failed. No retries permitted until 2026-04-16 08:35:16.995169981 +0000 UTC m=+130.402208921 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/53e35ca8-ec77-48b7-8e96-ae73f7083c85-metrics-certs") pod "network-metrics-daemon-vdxz4" (UID: "53e35ca8-ec77-48b7-8e96-ae73f7083c85") : secret "metrics-daemon-secret" not found Apr 16 08:34:16.363961 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:34:16.363929 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-ntc92" Apr 16 08:34:43.810832 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:34:43.810694 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7025f860-4946-43aa-9ebe-7d45f5616858-metrics-tls\") pod \"dns-default-flzpm\" (UID: \"7025f860-4946-43aa-9ebe-7d45f5616858\") " pod="openshift-dns/dns-default-flzpm" Apr 16 08:34:43.810832 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:34:43.810835 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 08:34:43.811400 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:34:43.810917 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7025f860-4946-43aa-9ebe-7d45f5616858-metrics-tls podName:7025f860-4946-43aa-9ebe-7d45f5616858 nodeName:}" failed. No retries permitted until 2026-04-16 08:35:47.81088281 +0000 UTC m=+161.217921754 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7025f860-4946-43aa-9ebe-7d45f5616858-metrics-tls") pod "dns-default-flzpm" (UID: "7025f860-4946-43aa-9ebe-7d45f5616858") : secret "dns-default-metrics-tls" not found Apr 16 08:34:43.912017 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:34:43.911983 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/06b0854a-ae72-4622-b5bd-6803c4a6d119-cert\") pod \"ingress-canary-b9rrh\" (UID: \"06b0854a-ae72-4622-b5bd-6803c4a6d119\") " pod="openshift-ingress-canary/ingress-canary-b9rrh" Apr 16 08:34:43.912143 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:34:43.912125 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 08:34:43.912211 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:34:43.912201 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/06b0854a-ae72-4622-b5bd-6803c4a6d119-cert podName:06b0854a-ae72-4622-b5bd-6803c4a6d119 nodeName:}" failed. No retries permitted until 2026-04-16 08:35:47.91218561 +0000 UTC m=+161.319224552 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/06b0854a-ae72-4622-b5bd-6803c4a6d119-cert") pod "ingress-canary-b9rrh" (UID: "06b0854a-ae72-4622-b5bd-6803c4a6d119") : secret "canary-serving-cert" not found Apr 16 08:35:15.272409 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:15.272373 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-68d648d8c6-gdh8b"] Apr 16 08:35:15.275216 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:15.275193 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-d87b8d5fc-5dsf6"] Apr 16 08:35:15.275357 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:15.275341 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-68d648d8c6-gdh8b" Apr 16 08:35:15.278620 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:15.278598 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\"" Apr 16 08:35:15.278728 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:15.278689 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-d87b8d5fc-5dsf6" Apr 16 08:35:15.279031 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:15.279009 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" Apr 16 08:35:15.279146 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:15.279029 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 16 08:35:15.279210 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:15.279187 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\"" Apr 16 08:35:15.279210 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:15.279204 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\"" Apr 16 08:35:15.279307 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:15.279212 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 16 08:35:15.279711 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:15.279698 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-px2hd\"" Apr 16 08:35:15.280866 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:15.280833 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\"" Apr 16 
08:35:15.281231 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:15.281202 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\"" Apr 16 08:35:15.281231 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:15.281213 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\"" Apr 16 08:35:15.281433 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:15.281418 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\"" Apr 16 08:35:15.281502 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:15.281490 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-m79qb\"" Apr 16 08:35:15.286273 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:15.286253 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" Apr 16 08:35:15.287654 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:15.287632 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-68d648d8c6-gdh8b"] Apr 16 08:35:15.290354 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:15.290337 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-d87b8d5fc-5dsf6"] Apr 16 08:35:15.419781 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:15.419747 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/108ff4c3-fb90-4321-9fcb-90d3aeeab5bf-stats-auth\") pod \"router-default-68d648d8c6-gdh8b\" (UID: \"108ff4c3-fb90-4321-9fcb-90d3aeeab5bf\") " pod="openshift-ingress/router-default-68d648d8c6-gdh8b" Apr 16 08:35:15.419781 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:15.419782 2579 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhktt\" (UniqueName: \"kubernetes.io/projected/e931af40-20ab-4a49-9b92-41dc1be13602-kube-api-access-fhktt\") pod \"console-operator-d87b8d5fc-5dsf6\" (UID: \"e931af40-20ab-4a49-9b92-41dc1be13602\") " pod="openshift-console-operator/console-operator-d87b8d5fc-5dsf6" Apr 16 08:35:15.420033 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:15.419810 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/108ff4c3-fb90-4321-9fcb-90d3aeeab5bf-service-ca-bundle\") pod \"router-default-68d648d8c6-gdh8b\" (UID: \"108ff4c3-fb90-4321-9fcb-90d3aeeab5bf\") " pod="openshift-ingress/router-default-68d648d8c6-gdh8b" Apr 16 08:35:15.420033 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:15.419883 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e931af40-20ab-4a49-9b92-41dc1be13602-config\") pod \"console-operator-d87b8d5fc-5dsf6\" (UID: \"e931af40-20ab-4a49-9b92-41dc1be13602\") " pod="openshift-console-operator/console-operator-d87b8d5fc-5dsf6" Apr 16 08:35:15.420033 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:15.419931 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5ggm\" (UniqueName: \"kubernetes.io/projected/108ff4c3-fb90-4321-9fcb-90d3aeeab5bf-kube-api-access-z5ggm\") pod \"router-default-68d648d8c6-gdh8b\" (UID: \"108ff4c3-fb90-4321-9fcb-90d3aeeab5bf\") " pod="openshift-ingress/router-default-68d648d8c6-gdh8b" Apr 16 08:35:15.420033 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:15.419954 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/108ff4c3-fb90-4321-9fcb-90d3aeeab5bf-metrics-certs\") pod \"router-default-68d648d8c6-gdh8b\" (UID: \"108ff4c3-fb90-4321-9fcb-90d3aeeab5bf\") " pod="openshift-ingress/router-default-68d648d8c6-gdh8b" Apr 16 08:35:15.420228 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:15.420012 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e931af40-20ab-4a49-9b92-41dc1be13602-serving-cert\") pod \"console-operator-d87b8d5fc-5dsf6\" (UID: \"e931af40-20ab-4a49-9b92-41dc1be13602\") " pod="openshift-console-operator/console-operator-d87b8d5fc-5dsf6" Apr 16 08:35:15.420228 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:15.420069 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e931af40-20ab-4a49-9b92-41dc1be13602-trusted-ca\") pod \"console-operator-d87b8d5fc-5dsf6\" (UID: \"e931af40-20ab-4a49-9b92-41dc1be13602\") " pod="openshift-console-operator/console-operator-d87b8d5fc-5dsf6" Apr 16 08:35:15.420228 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:15.420128 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/108ff4c3-fb90-4321-9fcb-90d3aeeab5bf-default-certificate\") pod \"router-default-68d648d8c6-gdh8b\" (UID: \"108ff4c3-fb90-4321-9fcb-90d3aeeab5bf\") " pod="openshift-ingress/router-default-68d648d8c6-gdh8b" Apr 16 08:35:15.521159 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:15.521132 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z5ggm\" (UniqueName: \"kubernetes.io/projected/108ff4c3-fb90-4321-9fcb-90d3aeeab5bf-kube-api-access-z5ggm\") pod \"router-default-68d648d8c6-gdh8b\" (UID: \"108ff4c3-fb90-4321-9fcb-90d3aeeab5bf\") " pod="openshift-ingress/router-default-68d648d8c6-gdh8b" Apr 16 
08:35:15.521302 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:15.521164 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/108ff4c3-fb90-4321-9fcb-90d3aeeab5bf-metrics-certs\") pod \"router-default-68d648d8c6-gdh8b\" (UID: \"108ff4c3-fb90-4321-9fcb-90d3aeeab5bf\") " pod="openshift-ingress/router-default-68d648d8c6-gdh8b" Apr 16 08:35:15.521302 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:15.521184 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e931af40-20ab-4a49-9b92-41dc1be13602-serving-cert\") pod \"console-operator-d87b8d5fc-5dsf6\" (UID: \"e931af40-20ab-4a49-9b92-41dc1be13602\") " pod="openshift-console-operator/console-operator-d87b8d5fc-5dsf6" Apr 16 08:35:15.521302 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:15.521201 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e931af40-20ab-4a49-9b92-41dc1be13602-trusted-ca\") pod \"console-operator-d87b8d5fc-5dsf6\" (UID: \"e931af40-20ab-4a49-9b92-41dc1be13602\") " pod="openshift-console-operator/console-operator-d87b8d5fc-5dsf6" Apr 16 08:35:15.521302 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:35:15.521280 2579 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 08:35:15.521495 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:35:15.521364 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/108ff4c3-fb90-4321-9fcb-90d3aeeab5bf-metrics-certs podName:108ff4c3-fb90-4321-9fcb-90d3aeeab5bf nodeName:}" failed. No retries permitted until 2026-04-16 08:35:16.021341236 +0000 UTC m=+129.428380177 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/108ff4c3-fb90-4321-9fcb-90d3aeeab5bf-metrics-certs") pod "router-default-68d648d8c6-gdh8b" (UID: "108ff4c3-fb90-4321-9fcb-90d3aeeab5bf") : secret "router-metrics-certs-default" not found Apr 16 08:35:15.521495 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:15.521407 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/108ff4c3-fb90-4321-9fcb-90d3aeeab5bf-default-certificate\") pod \"router-default-68d648d8c6-gdh8b\" (UID: \"108ff4c3-fb90-4321-9fcb-90d3aeeab5bf\") " pod="openshift-ingress/router-default-68d648d8c6-gdh8b" Apr 16 08:35:15.521495 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:15.521488 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/108ff4c3-fb90-4321-9fcb-90d3aeeab5bf-stats-auth\") pod \"router-default-68d648d8c6-gdh8b\" (UID: \"108ff4c3-fb90-4321-9fcb-90d3aeeab5bf\") " pod="openshift-ingress/router-default-68d648d8c6-gdh8b" Apr 16 08:35:15.521649 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:15.521519 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fhktt\" (UniqueName: \"kubernetes.io/projected/e931af40-20ab-4a49-9b92-41dc1be13602-kube-api-access-fhktt\") pod \"console-operator-d87b8d5fc-5dsf6\" (UID: \"e931af40-20ab-4a49-9b92-41dc1be13602\") " pod="openshift-console-operator/console-operator-d87b8d5fc-5dsf6" Apr 16 08:35:15.521649 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:15.521557 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/108ff4c3-fb90-4321-9fcb-90d3aeeab5bf-service-ca-bundle\") pod \"router-default-68d648d8c6-gdh8b\" (UID: \"108ff4c3-fb90-4321-9fcb-90d3aeeab5bf\") " pod="openshift-ingress/router-default-68d648d8c6-gdh8b" Apr 16 
08:35:15.521649 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:15.521585 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e931af40-20ab-4a49-9b92-41dc1be13602-config\") pod \"console-operator-d87b8d5fc-5dsf6\" (UID: \"e931af40-20ab-4a49-9b92-41dc1be13602\") " pod="openshift-console-operator/console-operator-d87b8d5fc-5dsf6" Apr 16 08:35:15.521789 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:35:15.521728 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/108ff4c3-fb90-4321-9fcb-90d3aeeab5bf-service-ca-bundle podName:108ff4c3-fb90-4321-9fcb-90d3aeeab5bf nodeName:}" failed. No retries permitted until 2026-04-16 08:35:16.021710528 +0000 UTC m=+129.428749472 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/108ff4c3-fb90-4321-9fcb-90d3aeeab5bf-service-ca-bundle") pod "router-default-68d648d8c6-gdh8b" (UID: "108ff4c3-fb90-4321-9fcb-90d3aeeab5bf") : configmap references non-existent config key: service-ca.crt Apr 16 08:35:15.522311 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:15.522286 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e931af40-20ab-4a49-9b92-41dc1be13602-config\") pod \"console-operator-d87b8d5fc-5dsf6\" (UID: \"e931af40-20ab-4a49-9b92-41dc1be13602\") " pod="openshift-console-operator/console-operator-d87b8d5fc-5dsf6" Apr 16 08:35:15.522311 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:15.522286 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e931af40-20ab-4a49-9b92-41dc1be13602-trusted-ca\") pod \"console-operator-d87b8d5fc-5dsf6\" (UID: \"e931af40-20ab-4a49-9b92-41dc1be13602\") " pod="openshift-console-operator/console-operator-d87b8d5fc-5dsf6" Apr 16 08:35:15.524662 ip-10-0-130-41 kubenswrapper[2579]: 
I0416 08:35:15.524604 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e931af40-20ab-4a49-9b92-41dc1be13602-serving-cert\") pod \"console-operator-d87b8d5fc-5dsf6\" (UID: \"e931af40-20ab-4a49-9b92-41dc1be13602\") " pod="openshift-console-operator/console-operator-d87b8d5fc-5dsf6" Apr 16 08:35:15.524772 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:15.524756 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/108ff4c3-fb90-4321-9fcb-90d3aeeab5bf-stats-auth\") pod \"router-default-68d648d8c6-gdh8b\" (UID: \"108ff4c3-fb90-4321-9fcb-90d3aeeab5bf\") " pod="openshift-ingress/router-default-68d648d8c6-gdh8b" Apr 16 08:35:15.524813 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:15.524799 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/108ff4c3-fb90-4321-9fcb-90d3aeeab5bf-default-certificate\") pod \"router-default-68d648d8c6-gdh8b\" (UID: \"108ff4c3-fb90-4321-9fcb-90d3aeeab5bf\") " pod="openshift-ingress/router-default-68d648d8c6-gdh8b" Apr 16 08:35:15.531019 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:15.530992 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5ggm\" (UniqueName: \"kubernetes.io/projected/108ff4c3-fb90-4321-9fcb-90d3aeeab5bf-kube-api-access-z5ggm\") pod \"router-default-68d648d8c6-gdh8b\" (UID: \"108ff4c3-fb90-4321-9fcb-90d3aeeab5bf\") " pod="openshift-ingress/router-default-68d648d8c6-gdh8b" Apr 16 08:35:15.531266 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:15.531248 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhktt\" (UniqueName: \"kubernetes.io/projected/e931af40-20ab-4a49-9b92-41dc1be13602-kube-api-access-fhktt\") pod \"console-operator-d87b8d5fc-5dsf6\" (UID: \"e931af40-20ab-4a49-9b92-41dc1be13602\") " 
pod="openshift-console-operator/console-operator-d87b8d5fc-5dsf6" Apr 16 08:35:15.591986 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:15.591959 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-d87b8d5fc-5dsf6" Apr 16 08:35:15.712880 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:15.712848 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-d87b8d5fc-5dsf6"] Apr 16 08:35:15.715808 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:35:15.715785 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode931af40_20ab_4a49_9b92_41dc1be13602.slice/crio-2eb00d88c2b243dec273c893beba250b62759df19d7fb7ea55d282a4ee2ea987 WatchSource:0}: Error finding container 2eb00d88c2b243dec273c893beba250b62759df19d7fb7ea55d282a4ee2ea987: Status 404 returned error can't find the container with id 2eb00d88c2b243dec273c893beba250b62759df19d7fb7ea55d282a4ee2ea987 Apr 16 08:35:16.025718 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:16.025689 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/108ff4c3-fb90-4321-9fcb-90d3aeeab5bf-service-ca-bundle\") pod \"router-default-68d648d8c6-gdh8b\" (UID: \"108ff4c3-fb90-4321-9fcb-90d3aeeab5bf\") " pod="openshift-ingress/router-default-68d648d8c6-gdh8b" Apr 16 08:35:16.025880 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:16.025726 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/108ff4c3-fb90-4321-9fcb-90d3aeeab5bf-metrics-certs\") pod \"router-default-68d648d8c6-gdh8b\" (UID: \"108ff4c3-fb90-4321-9fcb-90d3aeeab5bf\") " pod="openshift-ingress/router-default-68d648d8c6-gdh8b" Apr 16 08:35:16.025880 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:35:16.025821 2579 secret.go:189] 
Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 08:35:16.025880 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:35:16.025849 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/108ff4c3-fb90-4321-9fcb-90d3aeeab5bf-service-ca-bundle podName:108ff4c3-fb90-4321-9fcb-90d3aeeab5bf nodeName:}" failed. No retries permitted until 2026-04-16 08:35:17.025831548 +0000 UTC m=+130.432870492 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/108ff4c3-fb90-4321-9fcb-90d3aeeab5bf-service-ca-bundle") pod "router-default-68d648d8c6-gdh8b" (UID: "108ff4c3-fb90-4321-9fcb-90d3aeeab5bf") : configmap references non-existent config key: service-ca.crt Apr 16 08:35:16.025880 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:35:16.025871 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/108ff4c3-fb90-4321-9fcb-90d3aeeab5bf-metrics-certs podName:108ff4c3-fb90-4321-9fcb-90d3aeeab5bf nodeName:}" failed. No retries permitted until 2026-04-16 08:35:17.025863919 +0000 UTC m=+130.432902859 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/108ff4c3-fb90-4321-9fcb-90d3aeeab5bf-metrics-certs") pod "router-default-68d648d8c6-gdh8b" (UID: "108ff4c3-fb90-4321-9fcb-90d3aeeab5bf") : secret "router-metrics-certs-default" not found Apr 16 08:35:16.528367 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:16.528318 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-5dsf6" event={"ID":"e931af40-20ab-4a49-9b92-41dc1be13602","Type":"ContainerStarted","Data":"2eb00d88c2b243dec273c893beba250b62759df19d7fb7ea55d282a4ee2ea987"} Apr 16 08:35:17.035485 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:17.035443 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/108ff4c3-fb90-4321-9fcb-90d3aeeab5bf-service-ca-bundle\") pod \"router-default-68d648d8c6-gdh8b\" (UID: \"108ff4c3-fb90-4321-9fcb-90d3aeeab5bf\") " pod="openshift-ingress/router-default-68d648d8c6-gdh8b" Apr 16 08:35:17.035664 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:17.035504 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/108ff4c3-fb90-4321-9fcb-90d3aeeab5bf-metrics-certs\") pod \"router-default-68d648d8c6-gdh8b\" (UID: \"108ff4c3-fb90-4321-9fcb-90d3aeeab5bf\") " pod="openshift-ingress/router-default-68d648d8c6-gdh8b" Apr 16 08:35:17.035664 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:17.035533 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/53e35ca8-ec77-48b7-8e96-ae73f7083c85-metrics-certs\") pod \"network-metrics-daemon-vdxz4\" (UID: \"53e35ca8-ec77-48b7-8e96-ae73f7083c85\") " pod="openshift-multus/network-metrics-daemon-vdxz4" Apr 16 08:35:17.035769 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:35:17.035675 2579 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/108ff4c3-fb90-4321-9fcb-90d3aeeab5bf-service-ca-bundle podName:108ff4c3-fb90-4321-9fcb-90d3aeeab5bf nodeName:}" failed. No retries permitted until 2026-04-16 08:35:19.035651193 +0000 UTC m=+132.442690138 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/108ff4c3-fb90-4321-9fcb-90d3aeeab5bf-service-ca-bundle") pod "router-default-68d648d8c6-gdh8b" (UID: "108ff4c3-fb90-4321-9fcb-90d3aeeab5bf") : configmap references non-existent config key: service-ca.crt Apr 16 08:35:17.035769 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:35:17.035682 2579 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 08:35:17.035769 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:35:17.035682 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 08:35:17.035769 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:35:17.035753 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/108ff4c3-fb90-4321-9fcb-90d3aeeab5bf-metrics-certs podName:108ff4c3-fb90-4321-9fcb-90d3aeeab5bf nodeName:}" failed. No retries permitted until 2026-04-16 08:35:19.035736399 +0000 UTC m=+132.442775343 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/108ff4c3-fb90-4321-9fcb-90d3aeeab5bf-metrics-certs") pod "router-default-68d648d8c6-gdh8b" (UID: "108ff4c3-fb90-4321-9fcb-90d3aeeab5bf") : secret "router-metrics-certs-default" not found Apr 16 08:35:17.036009 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:35:17.035814 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/53e35ca8-ec77-48b7-8e96-ae73f7083c85-metrics-certs podName:53e35ca8-ec77-48b7-8e96-ae73f7083c85 nodeName:}" failed. 
No retries permitted until 2026-04-16 08:37:19.035795482 +0000 UTC m=+252.442834437 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/53e35ca8-ec77-48b7-8e96-ae73f7083c85-metrics-certs") pod "network-metrics-daemon-vdxz4" (UID: "53e35ca8-ec77-48b7-8e96-ae73f7083c85") : secret "metrics-daemon-secret" not found Apr 16 08:35:18.533501 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:18.533474 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-5dsf6_e931af40-20ab-4a49-9b92-41dc1be13602/console-operator/0.log" Apr 16 08:35:18.533873 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:18.533514 2579 generic.go:358] "Generic (PLEG): container finished" podID="e931af40-20ab-4a49-9b92-41dc1be13602" containerID="80facfacb9e98460f9c412efc25c7af2b23ed6e55bd48a1c9df25c0ab29baff7" exitCode=255 Apr 16 08:35:18.533873 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:18.533562 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-5dsf6" event={"ID":"e931af40-20ab-4a49-9b92-41dc1be13602","Type":"ContainerDied","Data":"80facfacb9e98460f9c412efc25c7af2b23ed6e55bd48a1c9df25c0ab29baff7"} Apr 16 08:35:18.533873 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:18.533769 2579 scope.go:117] "RemoveContainer" containerID="80facfacb9e98460f9c412efc25c7af2b23ed6e55bd48a1c9df25c0ab29baff7" Apr 16 08:35:19.051674 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:19.051623 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/108ff4c3-fb90-4321-9fcb-90d3aeeab5bf-metrics-certs\") pod \"router-default-68d648d8c6-gdh8b\" (UID: \"108ff4c3-fb90-4321-9fcb-90d3aeeab5bf\") " pod="openshift-ingress/router-default-68d648d8c6-gdh8b" Apr 16 08:35:19.051858 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:19.051738 2579 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/108ff4c3-fb90-4321-9fcb-90d3aeeab5bf-service-ca-bundle\") pod \"router-default-68d648d8c6-gdh8b\" (UID: \"108ff4c3-fb90-4321-9fcb-90d3aeeab5bf\") " pod="openshift-ingress/router-default-68d648d8c6-gdh8b" Apr 16 08:35:19.051858 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:35:19.051777 2579 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 08:35:19.051858 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:35:19.051846 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/108ff4c3-fb90-4321-9fcb-90d3aeeab5bf-metrics-certs podName:108ff4c3-fb90-4321-9fcb-90d3aeeab5bf nodeName:}" failed. No retries permitted until 2026-04-16 08:35:23.051829856 +0000 UTC m=+136.458868797 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/108ff4c3-fb90-4321-9fcb-90d3aeeab5bf-metrics-certs") pod "router-default-68d648d8c6-gdh8b" (UID: "108ff4c3-fb90-4321-9fcb-90d3aeeab5bf") : secret "router-metrics-certs-default" not found Apr 16 08:35:19.051986 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:35:19.051861 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/108ff4c3-fb90-4321-9fcb-90d3aeeab5bf-service-ca-bundle podName:108ff4c3-fb90-4321-9fcb-90d3aeeab5bf nodeName:}" failed. No retries permitted until 2026-04-16 08:35:23.051855056 +0000 UTC m=+136.458893997 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/108ff4c3-fb90-4321-9fcb-90d3aeeab5bf-service-ca-bundle") pod "router-default-68d648d8c6-gdh8b" (UID: "108ff4c3-fb90-4321-9fcb-90d3aeeab5bf") : configmap references non-existent config key: service-ca.crt Apr 16 08:35:19.536990 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:19.536962 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-5dsf6_e931af40-20ab-4a49-9b92-41dc1be13602/console-operator/1.log" Apr 16 08:35:19.537374 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:19.537321 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-5dsf6_e931af40-20ab-4a49-9b92-41dc1be13602/console-operator/0.log" Apr 16 08:35:19.537374 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:19.537352 2579 generic.go:358] "Generic (PLEG): container finished" podID="e931af40-20ab-4a49-9b92-41dc1be13602" containerID="24a5565b093a48323cdfb683bdebfa8e6a966aa1674de3e52e81d269fdab9d34" exitCode=255 Apr 16 08:35:19.537447 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:19.537383 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-5dsf6" event={"ID":"e931af40-20ab-4a49-9b92-41dc1be13602","Type":"ContainerDied","Data":"24a5565b093a48323cdfb683bdebfa8e6a966aa1674de3e52e81d269fdab9d34"} Apr 16 08:35:19.537447 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:19.537424 2579 scope.go:117] "RemoveContainer" containerID="80facfacb9e98460f9c412efc25c7af2b23ed6e55bd48a1c9df25c0ab29baff7" Apr 16 08:35:19.537645 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:19.537630 2579 scope.go:117] "RemoveContainer" containerID="24a5565b093a48323cdfb683bdebfa8e6a966aa1674de3e52e81d269fdab9d34" Apr 16 08:35:19.537826 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:35:19.537809 2579 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-d87b8d5fc-5dsf6_openshift-console-operator(e931af40-20ab-4a49-9b92-41dc1be13602)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-5dsf6" podUID="e931af40-20ab-4a49-9b92-41dc1be13602" Apr 16 08:35:20.540769 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:20.540742 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-5dsf6_e931af40-20ab-4a49-9b92-41dc1be13602/console-operator/1.log" Apr 16 08:35:20.541160 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:20.541074 2579 scope.go:117] "RemoveContainer" containerID="24a5565b093a48323cdfb683bdebfa8e6a966aa1674de3e52e81d269fdab9d34" Apr 16 08:35:20.541243 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:35:20.541226 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-d87b8d5fc-5dsf6_openshift-console-operator(e931af40-20ab-4a49-9b92-41dc1be13602)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-5dsf6" podUID="e931af40-20ab-4a49-9b92-41dc1be13602" Apr 16 08:35:21.407648 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:21.407611 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-64d4d94569-fs7cw"] Apr 16 08:35:21.411127 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:21.411111 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-fs7cw" Apr 16 08:35:21.413966 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:21.413941 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Apr 16 08:35:21.413966 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:21.413955 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Apr 16 08:35:21.415086 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:21.415070 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-zhrhv\"" Apr 16 08:35:21.420818 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:21.420798 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-64d4d94569-fs7cw"] Apr 16 08:35:21.570469 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:21.570436 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9f6z\" (UniqueName: \"kubernetes.io/projected/88ff5b6d-15ea-4646-8401-4e76a7424f8e-kube-api-access-m9f6z\") pod \"migrator-64d4d94569-fs7cw\" (UID: \"88ff5b6d-15ea-4646-8401-4e76a7424f8e\") " pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-fs7cw" Apr 16 08:35:21.658486 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:21.658416 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-6fwv8_9ab7948f-df7f-4fae-ad8e-a2cab21c427e/dns-node-resolver/0.log" Apr 16 08:35:21.671719 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:21.671694 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m9f6z\" (UniqueName: 
\"kubernetes.io/projected/88ff5b6d-15ea-4646-8401-4e76a7424f8e-kube-api-access-m9f6z\") pod \"migrator-64d4d94569-fs7cw\" (UID: \"88ff5b6d-15ea-4646-8401-4e76a7424f8e\") " pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-fs7cw" Apr 16 08:35:21.681110 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:21.681087 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9f6z\" (UniqueName: \"kubernetes.io/projected/88ff5b6d-15ea-4646-8401-4e76a7424f8e-kube-api-access-m9f6z\") pod \"migrator-64d4d94569-fs7cw\" (UID: \"88ff5b6d-15ea-4646-8401-4e76a7424f8e\") " pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-fs7cw" Apr 16 08:35:21.719950 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:21.719928 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-fs7cw" Apr 16 08:35:21.839569 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:21.839540 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-64d4d94569-fs7cw"] Apr 16 08:35:21.842271 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:35:21.842248 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod88ff5b6d_15ea_4646_8401_4e76a7424f8e.slice/crio-378edfdbef26f80e2fec3bea43688968397310e3afdba0f07ba2909b14ed6166 WatchSource:0}: Error finding container 378edfdbef26f80e2fec3bea43688968397310e3afdba0f07ba2909b14ed6166: Status 404 returned error can't find the container with id 378edfdbef26f80e2fec3bea43688968397310e3afdba0f07ba2909b14ed6166 Apr 16 08:35:22.546694 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:22.546632 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-fs7cw" 
event={"ID":"88ff5b6d-15ea-4646-8401-4e76a7424f8e","Type":"ContainerStarted","Data":"378edfdbef26f80e2fec3bea43688968397310e3afdba0f07ba2909b14ed6166"} Apr 16 08:35:22.858537 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:22.858508 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-hsx9c_2dd4e090-3d55-499c-a3fb-7a04e930e31c/node-ca/0.log" Apr 16 08:35:23.081581 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:23.081553 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/108ff4c3-fb90-4321-9fcb-90d3aeeab5bf-service-ca-bundle\") pod \"router-default-68d648d8c6-gdh8b\" (UID: \"108ff4c3-fb90-4321-9fcb-90d3aeeab5bf\") " pod="openshift-ingress/router-default-68d648d8c6-gdh8b" Apr 16 08:35:23.081686 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:23.081589 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/108ff4c3-fb90-4321-9fcb-90d3aeeab5bf-metrics-certs\") pod \"router-default-68d648d8c6-gdh8b\" (UID: \"108ff4c3-fb90-4321-9fcb-90d3aeeab5bf\") " pod="openshift-ingress/router-default-68d648d8c6-gdh8b" Apr 16 08:35:23.081730 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:35:23.081709 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/108ff4c3-fb90-4321-9fcb-90d3aeeab5bf-service-ca-bundle podName:108ff4c3-fb90-4321-9fcb-90d3aeeab5bf nodeName:}" failed. No retries permitted until 2026-04-16 08:35:31.081688739 +0000 UTC m=+144.488727683 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/108ff4c3-fb90-4321-9fcb-90d3aeeab5bf-service-ca-bundle") pod "router-default-68d648d8c6-gdh8b" (UID: "108ff4c3-fb90-4321-9fcb-90d3aeeab5bf") : configmap references non-existent config key: service-ca.crt Apr 16 08:35:23.081730 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:35:23.081709 2579 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 08:35:23.081809 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:35:23.081748 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/108ff4c3-fb90-4321-9fcb-90d3aeeab5bf-metrics-certs podName:108ff4c3-fb90-4321-9fcb-90d3aeeab5bf nodeName:}" failed. No retries permitted until 2026-04-16 08:35:31.081738217 +0000 UTC m=+144.488777158 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/108ff4c3-fb90-4321-9fcb-90d3aeeab5bf-metrics-certs") pod "router-default-68d648d8c6-gdh8b" (UID: "108ff4c3-fb90-4321-9fcb-90d3aeeab5bf") : secret "router-metrics-certs-default" not found Apr 16 08:35:23.550226 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:23.550187 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-fs7cw" event={"ID":"88ff5b6d-15ea-4646-8401-4e76a7424f8e","Type":"ContainerStarted","Data":"702d295bec644298eba69112adb1b16b263ebc311d05bd54371db66fb145e3d4"} Apr 16 08:35:23.550226 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:23.550220 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-fs7cw" event={"ID":"88ff5b6d-15ea-4646-8401-4e76a7424f8e","Type":"ContainerStarted","Data":"9266db64ce96802912a5439ab43ac587fd0b5f8ed0f25f6cb86dea3156cb7092"} Apr 16 08:35:23.570192 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:23.570149 
2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-fs7cw" podStartSLOduration=1.494021573 podStartE2EDuration="2.570135531s" podCreationTimestamp="2026-04-16 08:35:21 +0000 UTC" firstStartedPulling="2026-04-16 08:35:21.844004862 +0000 UTC m=+135.251043802" lastFinishedPulling="2026-04-16 08:35:22.920118817 +0000 UTC m=+136.327157760" observedRunningTime="2026-04-16 08:35:23.568714398 +0000 UTC m=+136.975753385" watchObservedRunningTime="2026-04-16 08:35:23.570135531 +0000 UTC m=+136.977174494" Apr 16 08:35:25.592808 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:25.592772 2579 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-d87b8d5fc-5dsf6" Apr 16 08:35:25.592808 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:25.592808 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-d87b8d5fc-5dsf6" Apr 16 08:35:25.593228 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:25.593154 2579 scope.go:117] "RemoveContainer" containerID="24a5565b093a48323cdfb683bdebfa8e6a966aa1674de3e52e81d269fdab9d34" Apr 16 08:35:25.593318 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:35:25.593301 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-d87b8d5fc-5dsf6_openshift-console-operator(e931af40-20ab-4a49-9b92-41dc1be13602)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-5dsf6" podUID="e931af40-20ab-4a49-9b92-41dc1be13602" Apr 16 08:35:31.141289 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:31.141243 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/108ff4c3-fb90-4321-9fcb-90d3aeeab5bf-service-ca-bundle\") pod \"router-default-68d648d8c6-gdh8b\" (UID: \"108ff4c3-fb90-4321-9fcb-90d3aeeab5bf\") " pod="openshift-ingress/router-default-68d648d8c6-gdh8b" Apr 16 08:35:31.141289 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:31.141295 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/108ff4c3-fb90-4321-9fcb-90d3aeeab5bf-metrics-certs\") pod \"router-default-68d648d8c6-gdh8b\" (UID: \"108ff4c3-fb90-4321-9fcb-90d3aeeab5bf\") " pod="openshift-ingress/router-default-68d648d8c6-gdh8b" Apr 16 08:35:31.141719 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:35:31.141411 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/108ff4c3-fb90-4321-9fcb-90d3aeeab5bf-service-ca-bundle podName:108ff4c3-fb90-4321-9fcb-90d3aeeab5bf nodeName:}" failed. No retries permitted until 2026-04-16 08:35:47.141391073 +0000 UTC m=+160.548430013 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/108ff4c3-fb90-4321-9fcb-90d3aeeab5bf-service-ca-bundle") pod "router-default-68d648d8c6-gdh8b" (UID: "108ff4c3-fb90-4321-9fcb-90d3aeeab5bf") : configmap references non-existent config key: service-ca.crt Apr 16 08:35:31.143705 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:31.143689 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/108ff4c3-fb90-4321-9fcb-90d3aeeab5bf-metrics-certs\") pod \"router-default-68d648d8c6-gdh8b\" (UID: \"108ff4c3-fb90-4321-9fcb-90d3aeeab5bf\") " pod="openshift-ingress/router-default-68d648d8c6-gdh8b" Apr 16 08:35:40.185464 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:40.185431 2579 scope.go:117] "RemoveContainer" containerID="24a5565b093a48323cdfb683bdebfa8e6a966aa1674de3e52e81d269fdab9d34" Apr 16 08:35:40.586696 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:40.586672 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-5dsf6_e931af40-20ab-4a49-9b92-41dc1be13602/console-operator/1.log" Apr 16 08:35:40.586850 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:40.586724 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-5dsf6" event={"ID":"e931af40-20ab-4a49-9b92-41dc1be13602","Type":"ContainerStarted","Data":"069cdba94519c76682ed9aa9e816358424f70ba3be6ba9c64dd9fab91b4c1304"} Apr 16 08:35:40.587029 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:40.587010 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-d87b8d5fc-5dsf6" Apr 16 08:35:40.605273 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:40.605230 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-d87b8d5fc-5dsf6" 
podStartSLOduration=23.361201958 podStartE2EDuration="25.605220357s" podCreationTimestamp="2026-04-16 08:35:15 +0000 UTC" firstStartedPulling="2026-04-16 08:35:15.717464946 +0000 UTC m=+129.124503889" lastFinishedPulling="2026-04-16 08:35:17.961483344 +0000 UTC m=+131.368522288" observedRunningTime="2026-04-16 08:35:40.604686237 +0000 UTC m=+154.011725201" watchObservedRunningTime="2026-04-16 08:35:40.605220357 +0000 UTC m=+154.012259316" Apr 16 08:35:40.797244 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:40.797217 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-d87b8d5fc-5dsf6" Apr 16 08:35:43.002789 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:35:43.002749 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-flzpm" podUID="7025f860-4946-43aa-9ebe-7d45f5616858" Apr 16 08:35:43.007867 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:35:43.007845 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-b9rrh" podUID="06b0854a-ae72-4622-b5bd-6803c4a6d119" Apr 16 08:35:43.208353 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:35:43.208320 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-vdxz4" podUID="53e35ca8-ec77-48b7-8e96-ae73f7083c85" Apr 16 08:35:43.593529 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:43.593486 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-b9rrh"
Apr 16 08:35:43.593698 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:43.593569 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-flzpm"
Apr 16 08:35:43.630999 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:43.630478 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-7cdb7"]
Apr 16 08:35:43.633975 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:43.633951 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-7cdb7"
Apr 16 08:35:43.645507 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:43.645483 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-rqvkt\""
Apr 16 08:35:43.645598 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:43.645515 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 16 08:35:43.645598 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:43.645491 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 16 08:35:43.645598 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:43.645515 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 16 08:35:43.646151 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:43.646136 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 16 08:35:43.663829 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:43.663809 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-7cdb7"]
Apr 16 08:35:43.736245 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:43.736211 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-94967b795-42p2c"]
Apr 16 08:35:43.736392 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:43.736357 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/b9604c68-8fac-4d6c-bd52-7623732bd202-data-volume\") pod \"insights-runtime-extractor-7cdb7\" (UID: \"b9604c68-8fac-4d6c-bd52-7623732bd202\") " pod="openshift-insights/insights-runtime-extractor-7cdb7"
Apr 16 08:35:43.736431 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:43.736399 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/b9604c68-8fac-4d6c-bd52-7623732bd202-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-7cdb7\" (UID: \"b9604c68-8fac-4d6c-bd52-7623732bd202\") " pod="openshift-insights/insights-runtime-extractor-7cdb7"
Apr 16 08:35:43.736461 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:43.736434 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/b9604c68-8fac-4d6c-bd52-7623732bd202-crio-socket\") pod \"insights-runtime-extractor-7cdb7\" (UID: \"b9604c68-8fac-4d6c-bd52-7623732bd202\") " pod="openshift-insights/insights-runtime-extractor-7cdb7"
Apr 16 08:35:43.736528 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:43.736505 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/b9604c68-8fac-4d6c-bd52-7623732bd202-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-7cdb7\" (UID: \"b9604c68-8fac-4d6c-bd52-7623732bd202\") " pod="openshift-insights/insights-runtime-extractor-7cdb7"
Apr 16 08:35:43.736563 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:43.736542 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5mzqm\" (UniqueName: \"kubernetes.io/projected/b9604c68-8fac-4d6c-bd52-7623732bd202-kube-api-access-5mzqm\") pod \"insights-runtime-extractor-7cdb7\" (UID: \"b9604c68-8fac-4d6c-bd52-7623732bd202\") " pod="openshift-insights/insights-runtime-extractor-7cdb7"
Apr 16 08:35:43.739031 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:43.739017 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-94967b795-42p2c"
Apr 16 08:35:43.742009 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:43.741945 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 16 08:35:43.742009 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:43.741965 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-dk699\""
Apr 16 08:35:43.742342 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:43.742119 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 16 08:35:43.742342 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:43.742327 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 16 08:35:43.746768 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:43.746747 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 16 08:35:43.759697 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:43.759672 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-94967b795-42p2c"]
Apr 16 08:35:43.837469 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:43.837438 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/b9604c68-8fac-4d6c-bd52-7623732bd202-crio-socket\") pod \"insights-runtime-extractor-7cdb7\" (UID: \"b9604c68-8fac-4d6c-bd52-7623732bd202\") " pod="openshift-insights/insights-runtime-extractor-7cdb7"
Apr 16 08:35:43.837625 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:43.837475 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/db8c63e6-b841-42a4-8465-5a51f82d5a40-bound-sa-token\") pod \"image-registry-94967b795-42p2c\" (UID: \"db8c63e6-b841-42a4-8465-5a51f82d5a40\") " pod="openshift-image-registry/image-registry-94967b795-42p2c"
Apr 16 08:35:43.837625 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:43.837499 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/db8c63e6-b841-42a4-8465-5a51f82d5a40-trusted-ca\") pod \"image-registry-94967b795-42p2c\" (UID: \"db8c63e6-b841-42a4-8465-5a51f82d5a40\") " pod="openshift-image-registry/image-registry-94967b795-42p2c"
Apr 16 08:35:43.837625 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:43.837519 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5mzqm\" (UniqueName: \"kubernetes.io/projected/b9604c68-8fac-4d6c-bd52-7623732bd202-kube-api-access-5mzqm\") pod \"insights-runtime-extractor-7cdb7\" (UID: \"b9604c68-8fac-4d6c-bd52-7623732bd202\") " pod="openshift-insights/insights-runtime-extractor-7cdb7"
Apr 16 08:35:43.837625 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:43.837557 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/b9604c68-8fac-4d6c-bd52-7623732bd202-crio-socket\") pod \"insights-runtime-extractor-7cdb7\" (UID: \"b9604c68-8fac-4d6c-bd52-7623732bd202\") " pod="openshift-insights/insights-runtime-extractor-7cdb7"
Apr 16 08:35:43.837625 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:43.837561 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnx2c\" (UniqueName: \"kubernetes.io/projected/db8c63e6-b841-42a4-8465-5a51f82d5a40-kube-api-access-vnx2c\") pod \"image-registry-94967b795-42p2c\" (UID: \"db8c63e6-b841-42a4-8465-5a51f82d5a40\") " pod="openshift-image-registry/image-registry-94967b795-42p2c"
Apr 16 08:35:43.837625 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:43.837603 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/db8c63e6-b841-42a4-8465-5a51f82d5a40-ca-trust-extracted\") pod \"image-registry-94967b795-42p2c\" (UID: \"db8c63e6-b841-42a4-8465-5a51f82d5a40\") " pod="openshift-image-registry/image-registry-94967b795-42p2c"
Apr 16 08:35:43.837805 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:43.837632 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/b9604c68-8fac-4d6c-bd52-7623732bd202-data-volume\") pod \"insights-runtime-extractor-7cdb7\" (UID: \"b9604c68-8fac-4d6c-bd52-7623732bd202\") " pod="openshift-insights/insights-runtime-extractor-7cdb7"
Apr 16 08:35:43.837805 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:43.837664 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/b9604c68-8fac-4d6c-bd52-7623732bd202-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-7cdb7\" (UID: \"b9604c68-8fac-4d6c-bd52-7623732bd202\") " pod="openshift-insights/insights-runtime-extractor-7cdb7"
Apr 16 08:35:43.837805 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:43.837695 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/db8c63e6-b841-42a4-8465-5a51f82d5a40-image-registry-private-configuration\") pod \"image-registry-94967b795-42p2c\" (UID: \"db8c63e6-b841-42a4-8465-5a51f82d5a40\") " pod="openshift-image-registry/image-registry-94967b795-42p2c"
Apr 16 08:35:43.837805 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:43.837752 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/b9604c68-8fac-4d6c-bd52-7623732bd202-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-7cdb7\" (UID: \"b9604c68-8fac-4d6c-bd52-7623732bd202\") " pod="openshift-insights/insights-runtime-extractor-7cdb7"
Apr 16 08:35:43.837805 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:43.837781 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/db8c63e6-b841-42a4-8465-5a51f82d5a40-registry-certificates\") pod \"image-registry-94967b795-42p2c\" (UID: \"db8c63e6-b841-42a4-8465-5a51f82d5a40\") " pod="openshift-image-registry/image-registry-94967b795-42p2c"
Apr 16 08:35:43.837984 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:43.837817 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/db8c63e6-b841-42a4-8465-5a51f82d5a40-registry-tls\") pod \"image-registry-94967b795-42p2c\" (UID: \"db8c63e6-b841-42a4-8465-5a51f82d5a40\") " pod="openshift-image-registry/image-registry-94967b795-42p2c"
Apr 16 08:35:43.837984 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:43.837837 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/db8c63e6-b841-42a4-8465-5a51f82d5a40-installation-pull-secrets\") pod \"image-registry-94967b795-42p2c\" (UID: \"db8c63e6-b841-42a4-8465-5a51f82d5a40\") " pod="openshift-image-registry/image-registry-94967b795-42p2c"
Apr 16 08:35:43.838073 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:43.838053 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/b9604c68-8fac-4d6c-bd52-7623732bd202-data-volume\") pod \"insights-runtime-extractor-7cdb7\" (UID: \"b9604c68-8fac-4d6c-bd52-7623732bd202\") " pod="openshift-insights/insights-runtime-extractor-7cdb7"
Apr 16 08:35:43.838248 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:43.838232 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/b9604c68-8fac-4d6c-bd52-7623732bd202-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-7cdb7\" (UID: \"b9604c68-8fac-4d6c-bd52-7623732bd202\") " pod="openshift-insights/insights-runtime-extractor-7cdb7"
Apr 16 08:35:43.840159 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:43.840141 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/b9604c68-8fac-4d6c-bd52-7623732bd202-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-7cdb7\" (UID: \"b9604c68-8fac-4d6c-bd52-7623732bd202\") " pod="openshift-insights/insights-runtime-extractor-7cdb7"
Apr 16 08:35:43.846613 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:43.846560 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5mzqm\" (UniqueName: \"kubernetes.io/projected/b9604c68-8fac-4d6c-bd52-7623732bd202-kube-api-access-5mzqm\") pod \"insights-runtime-extractor-7cdb7\" (UID: \"b9604c68-8fac-4d6c-bd52-7623732bd202\") " pod="openshift-insights/insights-runtime-extractor-7cdb7"
Apr 16 08:35:43.938598 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:43.938566 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vnx2c\" (UniqueName: \"kubernetes.io/projected/db8c63e6-b841-42a4-8465-5a51f82d5a40-kube-api-access-vnx2c\") pod \"image-registry-94967b795-42p2c\" (UID: \"db8c63e6-b841-42a4-8465-5a51f82d5a40\") " pod="openshift-image-registry/image-registry-94967b795-42p2c"
Apr 16 08:35:43.938775 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:43.938605 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/db8c63e6-b841-42a4-8465-5a51f82d5a40-ca-trust-extracted\") pod \"image-registry-94967b795-42p2c\" (UID: \"db8c63e6-b841-42a4-8465-5a51f82d5a40\") " pod="openshift-image-registry/image-registry-94967b795-42p2c"
Apr 16 08:35:43.938850 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:43.938826 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/db8c63e6-b841-42a4-8465-5a51f82d5a40-image-registry-private-configuration\") pod \"image-registry-94967b795-42p2c\" (UID: \"db8c63e6-b841-42a4-8465-5a51f82d5a40\") " pod="openshift-image-registry/image-registry-94967b795-42p2c"
Apr 16 08:35:43.938941 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:43.938925 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/db8c63e6-b841-42a4-8465-5a51f82d5a40-registry-certificates\") pod \"image-registry-94967b795-42p2c\" (UID: \"db8c63e6-b841-42a4-8465-5a51f82d5a40\") " pod="openshift-image-registry/image-registry-94967b795-42p2c"
Apr 16 08:35:43.939003 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:43.938988 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/db8c63e6-b841-42a4-8465-5a51f82d5a40-registry-tls\") pod \"image-registry-94967b795-42p2c\" (UID: \"db8c63e6-b841-42a4-8465-5a51f82d5a40\") " pod="openshift-image-registry/image-registry-94967b795-42p2c"
Apr 16 08:35:43.939053 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:43.939010 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/db8c63e6-b841-42a4-8465-5a51f82d5a40-ca-trust-extracted\") pod \"image-registry-94967b795-42p2c\" (UID: \"db8c63e6-b841-42a4-8465-5a51f82d5a40\") " pod="openshift-image-registry/image-registry-94967b795-42p2c"
Apr 16 08:35:43.939053 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:43.939015 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/db8c63e6-b841-42a4-8465-5a51f82d5a40-installation-pull-secrets\") pod \"image-registry-94967b795-42p2c\" (UID: \"db8c63e6-b841-42a4-8465-5a51f82d5a40\") " pod="openshift-image-registry/image-registry-94967b795-42p2c"
Apr 16 08:35:43.939153 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:43.939088 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/db8c63e6-b841-42a4-8465-5a51f82d5a40-bound-sa-token\") pod \"image-registry-94967b795-42p2c\" (UID: \"db8c63e6-b841-42a4-8465-5a51f82d5a40\") " pod="openshift-image-registry/image-registry-94967b795-42p2c"
Apr 16 08:35:43.939153 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:43.939121 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/db8c63e6-b841-42a4-8465-5a51f82d5a40-trusted-ca\") pod \"image-registry-94967b795-42p2c\" (UID: \"db8c63e6-b841-42a4-8465-5a51f82d5a40\") " pod="openshift-image-registry/image-registry-94967b795-42p2c"
Apr 16 08:35:43.939816 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:43.939714 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/db8c63e6-b841-42a4-8465-5a51f82d5a40-registry-certificates\") pod \"image-registry-94967b795-42p2c\" (UID: \"db8c63e6-b841-42a4-8465-5a51f82d5a40\") " pod="openshift-image-registry/image-registry-94967b795-42p2c"
Apr 16 08:35:43.939816 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:43.939812 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/db8c63e6-b841-42a4-8465-5a51f82d5a40-trusted-ca\") pod \"image-registry-94967b795-42p2c\" (UID: \"db8c63e6-b841-42a4-8465-5a51f82d5a40\") " pod="openshift-image-registry/image-registry-94967b795-42p2c"
Apr 16 08:35:43.941554 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:43.941530 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/db8c63e6-b841-42a4-8465-5a51f82d5a40-installation-pull-secrets\") pod \"image-registry-94967b795-42p2c\" (UID: \"db8c63e6-b841-42a4-8465-5a51f82d5a40\") " pod="openshift-image-registry/image-registry-94967b795-42p2c"
Apr 16 08:35:43.941991 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:43.941970 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/db8c63e6-b841-42a4-8465-5a51f82d5a40-image-registry-private-configuration\") pod \"image-registry-94967b795-42p2c\" (UID: \"db8c63e6-b841-42a4-8465-5a51f82d5a40\") " pod="openshift-image-registry/image-registry-94967b795-42p2c"
Apr 16 08:35:43.942110 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:43.942008 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/db8c63e6-b841-42a4-8465-5a51f82d5a40-registry-tls\") pod \"image-registry-94967b795-42p2c\" (UID: \"db8c63e6-b841-42a4-8465-5a51f82d5a40\") " pod="openshift-image-registry/image-registry-94967b795-42p2c"
Apr 16 08:35:43.942760 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:43.942745 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-7cdb7"
Apr 16 08:35:43.970176 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:43.970144 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/db8c63e6-b841-42a4-8465-5a51f82d5a40-bound-sa-token\") pod \"image-registry-94967b795-42p2c\" (UID: \"db8c63e6-b841-42a4-8465-5a51f82d5a40\") " pod="openshift-image-registry/image-registry-94967b795-42p2c"
Apr 16 08:35:43.970387 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:43.970366 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vnx2c\" (UniqueName: \"kubernetes.io/projected/db8c63e6-b841-42a4-8465-5a51f82d5a40-kube-api-access-vnx2c\") pod \"image-registry-94967b795-42p2c\" (UID: \"db8c63e6-b841-42a4-8465-5a51f82d5a40\") " pod="openshift-image-registry/image-registry-94967b795-42p2c"
Apr 16 08:35:44.048398 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:44.048367 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-94967b795-42p2c"
Apr 16 08:35:44.059578 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:44.059551 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-7cdb7"]
Apr 16 08:35:44.062824 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:35:44.062797 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb9604c68_8fac_4d6c_bd52_7623732bd202.slice/crio-d79b9ba5c9f5b95fbaf4bbf92ab351c2eba6e9c4fee45c5bdb6a687f3365d6be WatchSource:0}: Error finding container d79b9ba5c9f5b95fbaf4bbf92ab351c2eba6e9c4fee45c5bdb6a687f3365d6be: Status 404 returned error can't find the container with id d79b9ba5c9f5b95fbaf4bbf92ab351c2eba6e9c4fee45c5bdb6a687f3365d6be
Apr 16 08:35:44.171236 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:44.171206 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-94967b795-42p2c"]
Apr 16 08:35:44.174591 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:35:44.174564 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb8c63e6_b841_42a4_8465_5a51f82d5a40.slice/crio-fceee7ecb42db828b3c54702ffa68ac3922b55ce4b399371a88af9545f90af95 WatchSource:0}: Error finding container fceee7ecb42db828b3c54702ffa68ac3922b55ce4b399371a88af9545f90af95: Status 404 returned error can't find the container with id fceee7ecb42db828b3c54702ffa68ac3922b55ce4b399371a88af9545f90af95
Apr 16 08:35:44.597872 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:44.597834 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-7cdb7" event={"ID":"b9604c68-8fac-4d6c-bd52-7623732bd202","Type":"ContainerStarted","Data":"b9b48e2c2b64adfd1f8a2fdb23f86a6d010c689147a335c0f3cf2a141b3c345e"}
Apr 16 08:35:44.597872 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:44.597878 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-7cdb7" event={"ID":"b9604c68-8fac-4d6c-bd52-7623732bd202","Type":"ContainerStarted","Data":"d79b9ba5c9f5b95fbaf4bbf92ab351c2eba6e9c4fee45c5bdb6a687f3365d6be"}
Apr 16 08:35:44.599284 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:44.599251 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-94967b795-42p2c" event={"ID":"db8c63e6-b841-42a4-8465-5a51f82d5a40","Type":"ContainerStarted","Data":"6bdda536f8a0a7e4875d0885c7ed61d3e3cfc9bd4f313be04c954f632e21f1cc"}
Apr 16 08:35:44.599411 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:44.599288 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-94967b795-42p2c" event={"ID":"db8c63e6-b841-42a4-8465-5a51f82d5a40","Type":"ContainerStarted","Data":"fceee7ecb42db828b3c54702ffa68ac3922b55ce4b399371a88af9545f90af95"}
Apr 16 08:35:44.599482 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:44.599465 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-94967b795-42p2c"
Apr 16 08:35:44.620853 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:44.620757 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-94967b795-42p2c" podStartSLOduration=1.620735996 podStartE2EDuration="1.620735996s" podCreationTimestamp="2026-04-16 08:35:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 08:35:44.6192423 +0000 UTC m=+158.026281277" watchObservedRunningTime="2026-04-16 08:35:44.620735996 +0000 UTC m=+158.027774960"
Apr 16 08:35:45.603709 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:45.603667 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-7cdb7" event={"ID":"b9604c68-8fac-4d6c-bd52-7623732bd202","Type":"ContainerStarted","Data":"861809d9c1746c91ad3241ce3748afa732f6b5b524452a411d5bbe4b2c4d40c1"}
Apr 16 08:35:46.608930 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:46.608874 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-7cdb7" event={"ID":"b9604c68-8fac-4d6c-bd52-7623732bd202","Type":"ContainerStarted","Data":"ba3f61fddf0bca2229b6d899a38db6f65d3d81f4706ee1faaeddeebe2276b9bb"}
Apr 16 08:35:46.628418 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:46.628371 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-7cdb7" podStartSLOduration=1.7258721110000002 podStartE2EDuration="3.628357177s" podCreationTimestamp="2026-04-16 08:35:43 +0000 UTC" firstStartedPulling="2026-04-16 08:35:44.128308227 +0000 UTC m=+157.535347171" lastFinishedPulling="2026-04-16 08:35:46.030793283 +0000 UTC m=+159.437832237" observedRunningTime="2026-04-16 08:35:46.627750817 +0000 UTC m=+160.034789779" watchObservedRunningTime="2026-04-16 08:35:46.628357177 +0000 UTC m=+160.035396223"
Apr 16 08:35:47.165454 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:47.165409 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/108ff4c3-fb90-4321-9fcb-90d3aeeab5bf-service-ca-bundle\") pod \"router-default-68d648d8c6-gdh8b\" (UID: \"108ff4c3-fb90-4321-9fcb-90d3aeeab5bf\") " pod="openshift-ingress/router-default-68d648d8c6-gdh8b"
Apr 16 08:35:47.166019 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:47.166001 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/108ff4c3-fb90-4321-9fcb-90d3aeeab5bf-service-ca-bundle\") pod \"router-default-68d648d8c6-gdh8b\" (UID: \"108ff4c3-fb90-4321-9fcb-90d3aeeab5bf\") " pod="openshift-ingress/router-default-68d648d8c6-gdh8b"
Apr 16 08:35:47.386770 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:47.386727 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-68d648d8c6-gdh8b"
Apr 16 08:35:47.518172 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:47.518141 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-68d648d8c6-gdh8b"]
Apr 16 08:35:47.521342 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:35:47.521310 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod108ff4c3_fb90_4321_9fcb_90d3aeeab5bf.slice/crio-02471956d7980f7b1e8e07dfb3361d201c53b1de07789a21bfe10d717fe829d7 WatchSource:0}: Error finding container 02471956d7980f7b1e8e07dfb3361d201c53b1de07789a21bfe10d717fe829d7: Status 404 returned error can't find the container with id 02471956d7980f7b1e8e07dfb3361d201c53b1de07789a21bfe10d717fe829d7
Apr 16 08:35:47.611946 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:47.611916 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-68d648d8c6-gdh8b" event={"ID":"108ff4c3-fb90-4321-9fcb-90d3aeeab5bf","Type":"ContainerStarted","Data":"a4d0dcdf71678486227ab5401888a0b23fe704e7196008d144c5da7496983ef3"}
Apr 16 08:35:47.612339 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:47.611951 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-68d648d8c6-gdh8b" event={"ID":"108ff4c3-fb90-4321-9fcb-90d3aeeab5bf","Type":"ContainerStarted","Data":"02471956d7980f7b1e8e07dfb3361d201c53b1de07789a21bfe10d717fe829d7"}
Apr 16 08:35:47.636182 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:47.636140 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-68d648d8c6-gdh8b" podStartSLOduration=32.636127501 podStartE2EDuration="32.636127501s" podCreationTimestamp="2026-04-16 08:35:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 08:35:47.635037178 +0000 UTC m=+161.042076166" watchObservedRunningTime="2026-04-16 08:35:47.636127501 +0000 UTC m=+161.043166463"
Apr 16 08:35:47.869689 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:47.869654 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7025f860-4946-43aa-9ebe-7d45f5616858-metrics-tls\") pod \"dns-default-flzpm\" (UID: \"7025f860-4946-43aa-9ebe-7d45f5616858\") " pod="openshift-dns/dns-default-flzpm"
Apr 16 08:35:47.871987 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:47.871956 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7025f860-4946-43aa-9ebe-7d45f5616858-metrics-tls\") pod \"dns-default-flzpm\" (UID: \"7025f860-4946-43aa-9ebe-7d45f5616858\") " pod="openshift-dns/dns-default-flzpm"
Apr 16 08:35:47.970709 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:47.970674 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/06b0854a-ae72-4622-b5bd-6803c4a6d119-cert\") pod \"ingress-canary-b9rrh\" (UID: \"06b0854a-ae72-4622-b5bd-6803c4a6d119\") " pod="openshift-ingress-canary/ingress-canary-b9rrh"
Apr 16 08:35:47.973147 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:47.973123 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/06b0854a-ae72-4622-b5bd-6803c4a6d119-cert\") pod \"ingress-canary-b9rrh\" (UID: \"06b0854a-ae72-4622-b5bd-6803c4a6d119\") " pod="openshift-ingress-canary/ingress-canary-b9rrh"
Apr 16 08:35:48.096570 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:48.096545 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-l2kgt\""
Apr 16 08:35:48.097523 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:48.097496 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-kp7vt\""
Apr 16 08:35:48.104385 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:48.104371 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-b9rrh"
Apr 16 08:35:48.104446 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:48.104395 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-flzpm"
Apr 16 08:35:48.229377 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:48.229343 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-flzpm"]
Apr 16 08:35:48.231648 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:35:48.231624 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7025f860_4946_43aa_9ebe_7d45f5616858.slice/crio-97c7d2bf1a912a0009897ca2e64db3132ed2e8dd3938aa258baf056d35aadcfe WatchSource:0}: Error finding container 97c7d2bf1a912a0009897ca2e64db3132ed2e8dd3938aa258baf056d35aadcfe: Status 404 returned error can't find the container with id 97c7d2bf1a912a0009897ca2e64db3132ed2e8dd3938aa258baf056d35aadcfe
Apr 16 08:35:48.246087 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:48.246064 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-b9rrh"]
Apr 16 08:35:48.248638 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:35:48.248609 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod06b0854a_ae72_4622_b5bd_6803c4a6d119.slice/crio-1c6976adc040ccf32206ff294111191c18d129365f7a80b6cd13f15e82bd7c69 WatchSource:0}: Error finding container 1c6976adc040ccf32206ff294111191c18d129365f7a80b6cd13f15e82bd7c69: Status 404 returned error can't find the container with id 1c6976adc040ccf32206ff294111191c18d129365f7a80b6cd13f15e82bd7c69
Apr 16 08:35:48.387117 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:48.387043 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-68d648d8c6-gdh8b"
Apr 16 08:35:48.389662 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:48.389637 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-68d648d8c6-gdh8b"
Apr 16 08:35:48.616566 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:48.616525 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-flzpm" event={"ID":"7025f860-4946-43aa-9ebe-7d45f5616858","Type":"ContainerStarted","Data":"97c7d2bf1a912a0009897ca2e64db3132ed2e8dd3938aa258baf056d35aadcfe"}
Apr 16 08:35:48.618211 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:48.618170 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-b9rrh" event={"ID":"06b0854a-ae72-4622-b5bd-6803c4a6d119","Type":"ContainerStarted","Data":"1c6976adc040ccf32206ff294111191c18d129365f7a80b6cd13f15e82bd7c69"}
Apr 16 08:35:48.618751 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:48.618717 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-68d648d8c6-gdh8b"
Apr 16 08:35:48.620245 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:48.620063 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-68d648d8c6-gdh8b"
Apr 16 08:35:49.526795 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:49.526765 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-wh5jv"]
Apr 16 08:35:49.529989 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:49.529958 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-wh5jv"
Apr 16 08:35:49.532621 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:49.532592 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-t56lp\""
Apr 16 08:35:49.532747 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:49.532631 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\""
Apr 16 08:35:49.540510 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:49.540489 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-wh5jv"]
Apr 16 08:35:49.582475 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:49.582447 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/a3aebdf4-6bc4-473a-8c68-006e1dc30678-tls-certificates\") pod \"prometheus-operator-admission-webhook-9cb97cd87-wh5jv\" (UID: \"a3aebdf4-6bc4-473a-8c68-006e1dc30678\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-wh5jv"
Apr 16 08:35:49.683980 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:49.683945 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/a3aebdf4-6bc4-473a-8c68-006e1dc30678-tls-certificates\") pod \"prometheus-operator-admission-webhook-9cb97cd87-wh5jv\" (UID: \"a3aebdf4-6bc4-473a-8c68-006e1dc30678\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-wh5jv"
Apr 16 08:35:49.687062 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:49.687035 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/a3aebdf4-6bc4-473a-8c68-006e1dc30678-tls-certificates\") pod \"prometheus-operator-admission-webhook-9cb97cd87-wh5jv\" (UID: \"a3aebdf4-6bc4-473a-8c68-006e1dc30678\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-wh5jv"
Apr 16 08:35:49.841261 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:49.841230 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-wh5jv"
Apr 16 08:35:50.115247 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:50.115190 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-wh5jv"]
Apr 16 08:35:50.118667 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:35:50.118640 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda3aebdf4_6bc4_473a_8c68_006e1dc30678.slice/crio-44afa58a3c64eac5565d79d3649424419d4e715e04b0802e9fc8e4994b2a718a WatchSource:0}: Error finding container 44afa58a3c64eac5565d79d3649424419d4e715e04b0802e9fc8e4994b2a718a: Status 404 returned error can't find the container with id 44afa58a3c64eac5565d79d3649424419d4e715e04b0802e9fc8e4994b2a718a
Apr 16 08:35:50.624912 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:50.624832 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-wh5jv" event={"ID":"a3aebdf4-6bc4-473a-8c68-006e1dc30678","Type":"ContainerStarted","Data":"44afa58a3c64eac5565d79d3649424419d4e715e04b0802e9fc8e4994b2a718a"}
Apr 16 08:35:50.626781 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:50.626738 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-flzpm" event={"ID":"7025f860-4946-43aa-9ebe-7d45f5616858","Type":"ContainerStarted","Data":"588c7b9ca2dbf8d4a6faec54b1615cef4ecc1ef7f68bed666685027076ba231c"}
Apr 16 08:35:50.626781 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:50.626777 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-flzpm" event={"ID":"7025f860-4946-43aa-9ebe-7d45f5616858","Type":"ContainerStarted","Data":"7fea062ae33db2fd99afbf6cae9d9d423efefc9d345ddf2dc618e853e1f58357"}
Apr 16 08:35:50.627074 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:50.626884 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-flzpm"
Apr 16 08:35:50.628492 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:50.628467 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-b9rrh" event={"ID":"06b0854a-ae72-4622-b5bd-6803c4a6d119","Type":"ContainerStarted","Data":"65331303e8a1025b97955f6ac453d058a35ea0038ac63de775fed619f2fce5c1"}
Apr 16 08:35:50.649598 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:50.649560 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-flzpm" podStartSLOduration=129.89934373 podStartE2EDuration="2m11.649548652s" podCreationTimestamp="2026-04-16 08:33:39 +0000 UTC" firstStartedPulling="2026-04-16 08:35:48.233484736 +0000 UTC m=+161.640523677" lastFinishedPulling="2026-04-16 08:35:49.983689659 +0000 UTC m=+163.390728599" observedRunningTime="2026-04-16 08:35:50.648684759 +0000 UTC m=+164.055723723" watchObservedRunningTime="2026-04-16 08:35:50.649548652 +0000 UTC m=+164.056587614"
Apr 16 08:35:50.666472 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:50.666421 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-b9rrh" podStartSLOduration=129.929234632 podStartE2EDuration="2m11.666407481s" podCreationTimestamp="2026-04-16 08:33:39 +0000 UTC" firstStartedPulling="2026-04-16 08:35:48.250288 +0000 UTC m=+161.657326941" lastFinishedPulling="2026-04-16 08:35:49.987460845 +0000 UTC m=+163.394499790"
observedRunningTime="2026-04-16 08:35:50.665617677 +0000 UTC m=+164.072656639" watchObservedRunningTime="2026-04-16 08:35:50.666407481 +0000 UTC m=+164.073446445" Apr 16 08:35:51.634716 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:51.634677 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-wh5jv" event={"ID":"a3aebdf4-6bc4-473a-8c68-006e1dc30678","Type":"ContainerStarted","Data":"15b38e64f141bd63a1931da1b29e2ae8d93267a52fd41496bb69b417c6fe5f1f"} Apr 16 08:35:51.652121 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:51.652079 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-wh5jv" podStartSLOduration=1.675627795 podStartE2EDuration="2.652065214s" podCreationTimestamp="2026-04-16 08:35:49 +0000 UTC" firstStartedPulling="2026-04-16 08:35:50.12030683 +0000 UTC m=+163.527345774" lastFinishedPulling="2026-04-16 08:35:51.096744239 +0000 UTC m=+164.503783193" observedRunningTime="2026-04-16 08:35:51.651151228 +0000 UTC m=+165.058190190" watchObservedRunningTime="2026-04-16 08:35:51.652065214 +0000 UTC m=+165.059104179" Apr 16 08:35:52.637752 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:52.637718 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-wh5jv" Apr 16 08:35:52.642916 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:52.642876 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-wh5jv" Apr 16 08:35:57.185823 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:35:57.185794 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vdxz4" Apr 16 08:36:00.637101 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:36:00.637068 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-flzpm" Apr 16 08:36:04.984321 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:36:04.984288 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-ffnjx"] Apr 16 08:36:04.988963 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:36:04.988946 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-ffnjx" Apr 16 08:36:04.993929 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:36:04.993884 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 16 08:36:04.994058 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:36:04.993984 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-ggsvg\"" Apr 16 08:36:04.994119 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:36:04.994098 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 16 08:36:04.994707 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:36:04.994687 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 16 08:36:04.994863 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:36:04.994846 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 16 08:36:04.994946 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:36:04.994909 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 16 08:36:04.994996 ip-10-0-130-41 
kubenswrapper[2579]: I0416 08:36:04.994848 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 16 08:36:05.094332 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:36:05.094293 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/8634d4e3-1dce-4678-b6d6-0e3e4f8118a1-node-exporter-tls\") pod \"node-exporter-ffnjx\" (UID: \"8634d4e3-1dce-4678-b6d6-0e3e4f8118a1\") " pod="openshift-monitoring/node-exporter-ffnjx" Apr 16 08:36:05.094524 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:36:05.094338 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8634d4e3-1dce-4678-b6d6-0e3e4f8118a1-sys\") pod \"node-exporter-ffnjx\" (UID: \"8634d4e3-1dce-4678-b6d6-0e3e4f8118a1\") " pod="openshift-monitoring/node-exporter-ffnjx" Apr 16 08:36:05.094524 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:36:05.094367 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8634d4e3-1dce-4678-b6d6-0e3e4f8118a1-metrics-client-ca\") pod \"node-exporter-ffnjx\" (UID: \"8634d4e3-1dce-4678-b6d6-0e3e4f8118a1\") " pod="openshift-monitoring/node-exporter-ffnjx" Apr 16 08:36:05.094524 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:36:05.094447 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/8634d4e3-1dce-4678-b6d6-0e3e4f8118a1-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-ffnjx\" (UID: \"8634d4e3-1dce-4678-b6d6-0e3e4f8118a1\") " pod="openshift-monitoring/node-exporter-ffnjx" Apr 16 08:36:05.094524 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:36:05.094483 2579 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/8634d4e3-1dce-4678-b6d6-0e3e4f8118a1-node-exporter-accelerators-collector-config\") pod \"node-exporter-ffnjx\" (UID: \"8634d4e3-1dce-4678-b6d6-0e3e4f8118a1\") " pod="openshift-monitoring/node-exporter-ffnjx" Apr 16 08:36:05.094524 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:36:05.094508 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/8634d4e3-1dce-4678-b6d6-0e3e4f8118a1-node-exporter-wtmp\") pod \"node-exporter-ffnjx\" (UID: \"8634d4e3-1dce-4678-b6d6-0e3e4f8118a1\") " pod="openshift-monitoring/node-exporter-ffnjx" Apr 16 08:36:05.094739 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:36:05.094593 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/8634d4e3-1dce-4678-b6d6-0e3e4f8118a1-root\") pod \"node-exporter-ffnjx\" (UID: \"8634d4e3-1dce-4678-b6d6-0e3e4f8118a1\") " pod="openshift-monitoring/node-exporter-ffnjx" Apr 16 08:36:05.094739 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:36:05.094651 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65hv9\" (UniqueName: \"kubernetes.io/projected/8634d4e3-1dce-4678-b6d6-0e3e4f8118a1-kube-api-access-65hv9\") pod \"node-exporter-ffnjx\" (UID: \"8634d4e3-1dce-4678-b6d6-0e3e4f8118a1\") " pod="openshift-monitoring/node-exporter-ffnjx" Apr 16 08:36:05.094739 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:36:05.094701 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/8634d4e3-1dce-4678-b6d6-0e3e4f8118a1-node-exporter-textfile\") pod \"node-exporter-ffnjx\" (UID: 
\"8634d4e3-1dce-4678-b6d6-0e3e4f8118a1\") " pod="openshift-monitoring/node-exporter-ffnjx" Apr 16 08:36:05.195931 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:36:05.195881 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-65hv9\" (UniqueName: \"kubernetes.io/projected/8634d4e3-1dce-4678-b6d6-0e3e4f8118a1-kube-api-access-65hv9\") pod \"node-exporter-ffnjx\" (UID: \"8634d4e3-1dce-4678-b6d6-0e3e4f8118a1\") " pod="openshift-monitoring/node-exporter-ffnjx" Apr 16 08:36:05.196139 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:36:05.195940 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/8634d4e3-1dce-4678-b6d6-0e3e4f8118a1-node-exporter-textfile\") pod \"node-exporter-ffnjx\" (UID: \"8634d4e3-1dce-4678-b6d6-0e3e4f8118a1\") " pod="openshift-monitoring/node-exporter-ffnjx" Apr 16 08:36:05.196139 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:36:05.195976 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/8634d4e3-1dce-4678-b6d6-0e3e4f8118a1-node-exporter-tls\") pod \"node-exporter-ffnjx\" (UID: \"8634d4e3-1dce-4678-b6d6-0e3e4f8118a1\") " pod="openshift-monitoring/node-exporter-ffnjx" Apr 16 08:36:05.196139 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:36:05.195994 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8634d4e3-1dce-4678-b6d6-0e3e4f8118a1-sys\") pod \"node-exporter-ffnjx\" (UID: \"8634d4e3-1dce-4678-b6d6-0e3e4f8118a1\") " pod="openshift-monitoring/node-exporter-ffnjx" Apr 16 08:36:05.196139 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:36:05.196011 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8634d4e3-1dce-4678-b6d6-0e3e4f8118a1-metrics-client-ca\") pod 
\"node-exporter-ffnjx\" (UID: \"8634d4e3-1dce-4678-b6d6-0e3e4f8118a1\") " pod="openshift-monitoring/node-exporter-ffnjx" Apr 16 08:36:05.196139 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:36:05.196044 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/8634d4e3-1dce-4678-b6d6-0e3e4f8118a1-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-ffnjx\" (UID: \"8634d4e3-1dce-4678-b6d6-0e3e4f8118a1\") " pod="openshift-monitoring/node-exporter-ffnjx" Apr 16 08:36:05.196139 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:36:05.196073 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/8634d4e3-1dce-4678-b6d6-0e3e4f8118a1-node-exporter-accelerators-collector-config\") pod \"node-exporter-ffnjx\" (UID: \"8634d4e3-1dce-4678-b6d6-0e3e4f8118a1\") " pod="openshift-monitoring/node-exporter-ffnjx" Apr 16 08:36:05.196139 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:36:05.196099 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/8634d4e3-1dce-4678-b6d6-0e3e4f8118a1-node-exporter-wtmp\") pod \"node-exporter-ffnjx\" (UID: \"8634d4e3-1dce-4678-b6d6-0e3e4f8118a1\") " pod="openshift-monitoring/node-exporter-ffnjx" Apr 16 08:36:05.196139 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:36:05.196110 2579 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 16 08:36:05.196139 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:36:05.196119 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8634d4e3-1dce-4678-b6d6-0e3e4f8118a1-sys\") pod \"node-exporter-ffnjx\" (UID: \"8634d4e3-1dce-4678-b6d6-0e3e4f8118a1\") " pod="openshift-monitoring/node-exporter-ffnjx" Apr 
16 08:36:05.196139 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:36:05.196139 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/8634d4e3-1dce-4678-b6d6-0e3e4f8118a1-root\") pod \"node-exporter-ffnjx\" (UID: \"8634d4e3-1dce-4678-b6d6-0e3e4f8118a1\") " pod="openshift-monitoring/node-exporter-ffnjx" Apr 16 08:36:05.196502 ip-10-0-130-41 kubenswrapper[2579]: E0416 08:36:05.196185 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8634d4e3-1dce-4678-b6d6-0e3e4f8118a1-node-exporter-tls podName:8634d4e3-1dce-4678-b6d6-0e3e4f8118a1 nodeName:}" failed. No retries permitted until 2026-04-16 08:36:05.696167704 +0000 UTC m=+179.103206649 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/8634d4e3-1dce-4678-b6d6-0e3e4f8118a1-node-exporter-tls") pod "node-exporter-ffnjx" (UID: "8634d4e3-1dce-4678-b6d6-0e3e4f8118a1") : secret "node-exporter-tls" not found Apr 16 08:36:05.196502 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:36:05.196214 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/8634d4e3-1dce-4678-b6d6-0e3e4f8118a1-root\") pod \"node-exporter-ffnjx\" (UID: \"8634d4e3-1dce-4678-b6d6-0e3e4f8118a1\") " pod="openshift-monitoring/node-exporter-ffnjx" Apr 16 08:36:05.196502 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:36:05.196399 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/8634d4e3-1dce-4678-b6d6-0e3e4f8118a1-node-exporter-textfile\") pod \"node-exporter-ffnjx\" (UID: \"8634d4e3-1dce-4678-b6d6-0e3e4f8118a1\") " pod="openshift-monitoring/node-exporter-ffnjx" Apr 16 08:36:05.196502 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:36:05.196434 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" 
(UniqueName: \"kubernetes.io/host-path/8634d4e3-1dce-4678-b6d6-0e3e4f8118a1-node-exporter-wtmp\") pod \"node-exporter-ffnjx\" (UID: \"8634d4e3-1dce-4678-b6d6-0e3e4f8118a1\") " pod="openshift-monitoring/node-exporter-ffnjx" Apr 16 08:36:05.196658 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:36:05.196642 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8634d4e3-1dce-4678-b6d6-0e3e4f8118a1-metrics-client-ca\") pod \"node-exporter-ffnjx\" (UID: \"8634d4e3-1dce-4678-b6d6-0e3e4f8118a1\") " pod="openshift-monitoring/node-exporter-ffnjx" Apr 16 08:36:05.196720 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:36:05.196674 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/8634d4e3-1dce-4678-b6d6-0e3e4f8118a1-node-exporter-accelerators-collector-config\") pod \"node-exporter-ffnjx\" (UID: \"8634d4e3-1dce-4678-b6d6-0e3e4f8118a1\") " pod="openshift-monitoring/node-exporter-ffnjx" Apr 16 08:36:05.198822 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:36:05.198801 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/8634d4e3-1dce-4678-b6d6-0e3e4f8118a1-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-ffnjx\" (UID: \"8634d4e3-1dce-4678-b6d6-0e3e4f8118a1\") " pod="openshift-monitoring/node-exporter-ffnjx" Apr 16 08:36:05.210493 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:36:05.210469 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-65hv9\" (UniqueName: \"kubernetes.io/projected/8634d4e3-1dce-4678-b6d6-0e3e4f8118a1-kube-api-access-65hv9\") pod \"node-exporter-ffnjx\" (UID: \"8634d4e3-1dce-4678-b6d6-0e3e4f8118a1\") " pod="openshift-monitoring/node-exporter-ffnjx" Apr 16 08:36:05.608259 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:36:05.608228 2579 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-94967b795-42p2c" Apr 16 08:36:05.700552 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:36:05.700516 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/8634d4e3-1dce-4678-b6d6-0e3e4f8118a1-node-exporter-tls\") pod \"node-exporter-ffnjx\" (UID: \"8634d4e3-1dce-4678-b6d6-0e3e4f8118a1\") " pod="openshift-monitoring/node-exporter-ffnjx" Apr 16 08:36:05.703128 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:36:05.703102 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/8634d4e3-1dce-4678-b6d6-0e3e4f8118a1-node-exporter-tls\") pod \"node-exporter-ffnjx\" (UID: \"8634d4e3-1dce-4678-b6d6-0e3e4f8118a1\") " pod="openshift-monitoring/node-exporter-ffnjx" Apr 16 08:36:05.901349 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:36:05.901258 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-ffnjx" Apr 16 08:36:05.910563 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:36:05.910532 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8634d4e3_1dce_4678_b6d6_0e3e4f8118a1.slice/crio-2d524e10c3b0aaf0de86253e02b3a0497c58b2b4a0aff38488af3d6f09c4fb40 WatchSource:0}: Error finding container 2d524e10c3b0aaf0de86253e02b3a0497c58b2b4a0aff38488af3d6f09c4fb40: Status 404 returned error can't find the container with id 2d524e10c3b0aaf0de86253e02b3a0497c58b2b4a0aff38488af3d6f09c4fb40 Apr 16 08:36:06.674655 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:36:06.674614 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-ffnjx" event={"ID":"8634d4e3-1dce-4678-b6d6-0e3e4f8118a1","Type":"ContainerStarted","Data":"2d524e10c3b0aaf0de86253e02b3a0497c58b2b4a0aff38488af3d6f09c4fb40"} Apr 16 08:36:07.678489 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:36:07.678454 2579 generic.go:358] "Generic (PLEG): container finished" podID="8634d4e3-1dce-4678-b6d6-0e3e4f8118a1" containerID="f75f6f27c37ee7571e70187fd407a4356ca1f51216dbaa6d715fe569e7543ad5" exitCode=0 Apr 16 08:36:07.678847 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:36:07.678505 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-ffnjx" event={"ID":"8634d4e3-1dce-4678-b6d6-0e3e4f8118a1","Type":"ContainerDied","Data":"f75f6f27c37ee7571e70187fd407a4356ca1f51216dbaa6d715fe569e7543ad5"} Apr 16 08:36:08.083058 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:36:08.083021 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-588b49b58c-x8xxw"] Apr 16 08:36:08.086511 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:36:08.086496 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-588b49b58c-x8xxw" Apr 16 08:36:08.089221 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:36:08.089190 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\"" Apr 16 08:36:08.089392 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:36:08.089241 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\"" Apr 16 08:36:08.089392 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:36:08.089269 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\"" Apr 16 08:36:08.089392 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:36:08.089321 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\"" Apr 16 08:36:08.089392 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:36:08.089380 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-tthjd\"" Apr 16 08:36:08.089580 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:36:08.089471 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-e9smm1aec63d4\"" Apr 16 08:36:08.089580 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:36:08.089494 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\"" Apr 16 08:36:08.097929 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:36:08.097885 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-588b49b58c-x8xxw"] Apr 16 08:36:08.121044 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:36:08.121014 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/564b4b71-9646-4d83-bfa1-4eb02f22473a-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-588b49b58c-x8xxw\" (UID: \"564b4b71-9646-4d83-bfa1-4eb02f22473a\") " pod="openshift-monitoring/thanos-querier-588b49b58c-x8xxw" Apr 16 08:36:08.121177 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:36:08.121163 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/564b4b71-9646-4d83-bfa1-4eb02f22473a-secret-thanos-querier-tls\") pod \"thanos-querier-588b49b58c-x8xxw\" (UID: \"564b4b71-9646-4d83-bfa1-4eb02f22473a\") " pod="openshift-monitoring/thanos-querier-588b49b58c-x8xxw" Apr 16 08:36:08.121245 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:36:08.121198 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/564b4b71-9646-4d83-bfa1-4eb02f22473a-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-588b49b58c-x8xxw\" (UID: \"564b4b71-9646-4d83-bfa1-4eb02f22473a\") " pod="openshift-monitoring/thanos-querier-588b49b58c-x8xxw" Apr 16 08:36:08.121245 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:36:08.121234 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/564b4b71-9646-4d83-bfa1-4eb02f22473a-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-588b49b58c-x8xxw\" (UID: \"564b4b71-9646-4d83-bfa1-4eb02f22473a\") " pod="openshift-monitoring/thanos-querier-588b49b58c-x8xxw" Apr 16 08:36:08.121342 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:36:08.121277 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/564b4b71-9646-4d83-bfa1-4eb02f22473a-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-588b49b58c-x8xxw\" (UID: \"564b4b71-9646-4d83-bfa1-4eb02f22473a\") " pod="openshift-monitoring/thanos-querier-588b49b58c-x8xxw" Apr 16 08:36:08.121342 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:36:08.121313 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/564b4b71-9646-4d83-bfa1-4eb02f22473a-metrics-client-ca\") pod \"thanos-querier-588b49b58c-x8xxw\" (UID: \"564b4b71-9646-4d83-bfa1-4eb02f22473a\") " pod="openshift-monitoring/thanos-querier-588b49b58c-x8xxw" Apr 16 08:36:08.121446 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:36:08.121371 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5fkh\" (UniqueName: \"kubernetes.io/projected/564b4b71-9646-4d83-bfa1-4eb02f22473a-kube-api-access-l5fkh\") pod \"thanos-querier-588b49b58c-x8xxw\" (UID: \"564b4b71-9646-4d83-bfa1-4eb02f22473a\") " pod="openshift-monitoring/thanos-querier-588b49b58c-x8xxw" Apr 16 08:36:08.121446 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:36:08.121398 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/564b4b71-9646-4d83-bfa1-4eb02f22473a-secret-grpc-tls\") pod \"thanos-querier-588b49b58c-x8xxw\" (UID: \"564b4b71-9646-4d83-bfa1-4eb02f22473a\") " pod="openshift-monitoring/thanos-querier-588b49b58c-x8xxw" Apr 16 08:36:08.222103 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:36:08.222052 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/564b4b71-9646-4d83-bfa1-4eb02f22473a-secret-thanos-querier-kube-rbac-proxy\") pod 
\"thanos-querier-588b49b58c-x8xxw\" (UID: \"564b4b71-9646-4d83-bfa1-4eb02f22473a\") " pod="openshift-monitoring/thanos-querier-588b49b58c-x8xxw" Apr 16 08:36:08.222103 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:36:08.222102 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/564b4b71-9646-4d83-bfa1-4eb02f22473a-metrics-client-ca\") pod \"thanos-querier-588b49b58c-x8xxw\" (UID: \"564b4b71-9646-4d83-bfa1-4eb02f22473a\") " pod="openshift-monitoring/thanos-querier-588b49b58c-x8xxw" Apr 16 08:36:08.222348 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:36:08.222135 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l5fkh\" (UniqueName: \"kubernetes.io/projected/564b4b71-9646-4d83-bfa1-4eb02f22473a-kube-api-access-l5fkh\") pod \"thanos-querier-588b49b58c-x8xxw\" (UID: \"564b4b71-9646-4d83-bfa1-4eb02f22473a\") " pod="openshift-monitoring/thanos-querier-588b49b58c-x8xxw" Apr 16 08:36:08.222348 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:36:08.222152 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/564b4b71-9646-4d83-bfa1-4eb02f22473a-secret-grpc-tls\") pod \"thanos-querier-588b49b58c-x8xxw\" (UID: \"564b4b71-9646-4d83-bfa1-4eb02f22473a\") " pod="openshift-monitoring/thanos-querier-588b49b58c-x8xxw" Apr 16 08:36:08.222348 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:36:08.222186 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/564b4b71-9646-4d83-bfa1-4eb02f22473a-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-588b49b58c-x8xxw\" (UID: \"564b4b71-9646-4d83-bfa1-4eb02f22473a\") " pod="openshift-monitoring/thanos-querier-588b49b58c-x8xxw" Apr 16 08:36:08.222348 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:36:08.222212 2579 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/564b4b71-9646-4d83-bfa1-4eb02f22473a-secret-thanos-querier-tls\") pod \"thanos-querier-588b49b58c-x8xxw\" (UID: \"564b4b71-9646-4d83-bfa1-4eb02f22473a\") " pod="openshift-monitoring/thanos-querier-588b49b58c-x8xxw" Apr 16 08:36:08.222348 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:36:08.222229 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/564b4b71-9646-4d83-bfa1-4eb02f22473a-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-588b49b58c-x8xxw\" (UID: \"564b4b71-9646-4d83-bfa1-4eb02f22473a\") " pod="openshift-monitoring/thanos-querier-588b49b58c-x8xxw" Apr 16 08:36:08.222348 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:36:08.222252 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/564b4b71-9646-4d83-bfa1-4eb02f22473a-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-588b49b58c-x8xxw\" (UID: \"564b4b71-9646-4d83-bfa1-4eb02f22473a\") " pod="openshift-monitoring/thanos-querier-588b49b58c-x8xxw" Apr 16 08:36:08.223257 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:36:08.223228 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/564b4b71-9646-4d83-bfa1-4eb02f22473a-metrics-client-ca\") pod \"thanos-querier-588b49b58c-x8xxw\" (UID: \"564b4b71-9646-4d83-bfa1-4eb02f22473a\") " pod="openshift-monitoring/thanos-querier-588b49b58c-x8xxw" Apr 16 08:36:08.225094 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:36:08.225049 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: 
\"kubernetes.io/secret/564b4b71-9646-4d83-bfa1-4eb02f22473a-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-588b49b58c-x8xxw\" (UID: \"564b4b71-9646-4d83-bfa1-4eb02f22473a\") " pod="openshift-monitoring/thanos-querier-588b49b58c-x8xxw" Apr 16 08:36:08.225240 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:36:08.225116 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/564b4b71-9646-4d83-bfa1-4eb02f22473a-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-588b49b58c-x8xxw\" (UID: \"564b4b71-9646-4d83-bfa1-4eb02f22473a\") " pod="openshift-monitoring/thanos-querier-588b49b58c-x8xxw" Apr 16 08:36:08.225426 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:36:08.225393 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/564b4b71-9646-4d83-bfa1-4eb02f22473a-secret-thanos-querier-tls\") pod \"thanos-querier-588b49b58c-x8xxw\" (UID: \"564b4b71-9646-4d83-bfa1-4eb02f22473a\") " pod="openshift-monitoring/thanos-querier-588b49b58c-x8xxw" Apr 16 08:36:08.225581 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:36:08.225561 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/564b4b71-9646-4d83-bfa1-4eb02f22473a-secret-grpc-tls\") pod \"thanos-querier-588b49b58c-x8xxw\" (UID: \"564b4b71-9646-4d83-bfa1-4eb02f22473a\") " pod="openshift-monitoring/thanos-querier-588b49b58c-x8xxw" Apr 16 08:36:08.225639 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:36:08.225612 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/564b4b71-9646-4d83-bfa1-4eb02f22473a-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-588b49b58c-x8xxw\" (UID: \"564b4b71-9646-4d83-bfa1-4eb02f22473a\") " 
pod="openshift-monitoring/thanos-querier-588b49b58c-x8xxw" Apr 16 08:36:08.225759 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:36:08.225737 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/564b4b71-9646-4d83-bfa1-4eb02f22473a-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-588b49b58c-x8xxw\" (UID: \"564b4b71-9646-4d83-bfa1-4eb02f22473a\") " pod="openshift-monitoring/thanos-querier-588b49b58c-x8xxw" Apr 16 08:36:08.231669 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:36:08.231647 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5fkh\" (UniqueName: \"kubernetes.io/projected/564b4b71-9646-4d83-bfa1-4eb02f22473a-kube-api-access-l5fkh\") pod \"thanos-querier-588b49b58c-x8xxw\" (UID: \"564b4b71-9646-4d83-bfa1-4eb02f22473a\") " pod="openshift-monitoring/thanos-querier-588b49b58c-x8xxw" Apr 16 08:36:08.395561 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:36:08.395490 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-588b49b58c-x8xxw" Apr 16 08:36:08.518461 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:36:08.518434 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-588b49b58c-x8xxw"] Apr 16 08:36:08.520337 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:36:08.520311 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod564b4b71_9646_4d83_bfa1_4eb02f22473a.slice/crio-460be782c2a855f31c5c1b16bdbdc19b2e012d8dd3cc5ba37e09ec6b719a491f WatchSource:0}: Error finding container 460be782c2a855f31c5c1b16bdbdc19b2e012d8dd3cc5ba37e09ec6b719a491f: Status 404 returned error can't find the container with id 460be782c2a855f31c5c1b16bdbdc19b2e012d8dd3cc5ba37e09ec6b719a491f Apr 16 08:36:08.683520 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:36:08.683426 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-ffnjx" event={"ID":"8634d4e3-1dce-4678-b6d6-0e3e4f8118a1","Type":"ContainerStarted","Data":"3452e33b08077d6129503a0e94a22f9fc522a32d118971aa18a357c75285c04a"} Apr 16 08:36:08.683520 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:36:08.683468 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-ffnjx" event={"ID":"8634d4e3-1dce-4678-b6d6-0e3e4f8118a1","Type":"ContainerStarted","Data":"83f4b80a1f2b0e5109b9ecaabf8489ec622d78a1b9918cd7e9f2282a34d7e7b9"} Apr 16 08:36:08.684485 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:36:08.684466 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-588b49b58c-x8xxw" event={"ID":"564b4b71-9646-4d83-bfa1-4eb02f22473a","Type":"ContainerStarted","Data":"460be782c2a855f31c5c1b16bdbdc19b2e012d8dd3cc5ba37e09ec6b719a491f"} Apr 16 08:36:08.703430 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:36:08.702806 2579 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-monitoring/node-exporter-ffnjx" podStartSLOduration=3.85278988 podStartE2EDuration="4.702789987s" podCreationTimestamp="2026-04-16 08:36:04 +0000 UTC" firstStartedPulling="2026-04-16 08:36:05.912711284 +0000 UTC m=+179.319750242" lastFinishedPulling="2026-04-16 08:36:06.762711409 +0000 UTC m=+180.169750349" observedRunningTime="2026-04-16 08:36:08.702497841 +0000 UTC m=+182.109536805" watchObservedRunningTime="2026-04-16 08:36:08.702789987 +0000 UTC m=+182.109828951" Apr 16 08:36:10.693322 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:36:10.693290 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-588b49b58c-x8xxw" event={"ID":"564b4b71-9646-4d83-bfa1-4eb02f22473a","Type":"ContainerStarted","Data":"f16d80c354e022ab9301beb41c66fe0d46f31d12e153b169e65106f85241b071"} Apr 16 08:36:10.693322 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:36:10.693326 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-588b49b58c-x8xxw" event={"ID":"564b4b71-9646-4d83-bfa1-4eb02f22473a","Type":"ContainerStarted","Data":"cb06c84d4183f68d67b133611d8959b625ba069489646b31dca8696322d60289"} Apr 16 08:36:10.693731 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:36:10.693339 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-588b49b58c-x8xxw" event={"ID":"564b4b71-9646-4d83-bfa1-4eb02f22473a","Type":"ContainerStarted","Data":"9fd38ec0a8762cbb5eb5a02d5f57f8acc8b9e6225ad1aef51b8c6a60598602ce"} Apr 16 08:36:11.698782 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:36:11.698741 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-588b49b58c-x8xxw" event={"ID":"564b4b71-9646-4d83-bfa1-4eb02f22473a","Type":"ContainerStarted","Data":"1c5c1b138da46f08321407168a6917c4c5f3a400e177af219d5d647afa161c3a"} Apr 16 08:36:11.698782 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:36:11.698787 2579 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-monitoring/thanos-querier-588b49b58c-x8xxw" event={"ID":"564b4b71-9646-4d83-bfa1-4eb02f22473a","Type":"ContainerStarted","Data":"0e9b7e558baf396b8f4c7a77b100a0cb84566fc4644a3edaf541a6a30caaabdc"} Apr 16 08:36:11.699248 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:36:11.698801 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-588b49b58c-x8xxw" event={"ID":"564b4b71-9646-4d83-bfa1-4eb02f22473a","Type":"ContainerStarted","Data":"64ba26f163e43715fbde74aaeea1fe78f50b66e9a797c291159fb5c993b193dd"} Apr 16 08:36:11.699248 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:36:11.698929 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-588b49b58c-x8xxw" Apr 16 08:36:11.723298 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:36:11.723251 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-588b49b58c-x8xxw" podStartSLOduration=1.145805909 podStartE2EDuration="3.723237451s" podCreationTimestamp="2026-04-16 08:36:08 +0000 UTC" firstStartedPulling="2026-04-16 08:36:08.522861196 +0000 UTC m=+181.929900136" lastFinishedPulling="2026-04-16 08:36:11.100292736 +0000 UTC m=+184.507331678" observedRunningTime="2026-04-16 08:36:11.722929546 +0000 UTC m=+185.129968509" watchObservedRunningTime="2026-04-16 08:36:11.723237451 +0000 UTC m=+185.130276629" Apr 16 08:36:17.708862 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:36:17.708821 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-588b49b58c-x8xxw" Apr 16 08:37:19.065796 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:37:19.065695 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/53e35ca8-ec77-48b7-8e96-ae73f7083c85-metrics-certs\") pod \"network-metrics-daemon-vdxz4\" (UID: 
\"53e35ca8-ec77-48b7-8e96-ae73f7083c85\") " pod="openshift-multus/network-metrics-daemon-vdxz4" Apr 16 08:37:19.068206 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:37:19.068178 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/53e35ca8-ec77-48b7-8e96-ae73f7083c85-metrics-certs\") pod \"network-metrics-daemon-vdxz4\" (UID: \"53e35ca8-ec77-48b7-8e96-ae73f7083c85\") " pod="openshift-multus/network-metrics-daemon-vdxz4" Apr 16 08:37:19.089876 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:37:19.089851 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-kd4nk\"" Apr 16 08:37:19.098008 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:37:19.097982 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vdxz4" Apr 16 08:37:19.218797 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:37:19.218762 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-vdxz4"] Apr 16 08:37:19.222827 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:37:19.222805 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53e35ca8_ec77_48b7_8e96_ae73f7083c85.slice/crio-baf34aac0faf17fa9d0e24790c6909afc89459d3d6c6a8d138a026d7ffd5451e WatchSource:0}: Error finding container baf34aac0faf17fa9d0e24790c6909afc89459d3d6c6a8d138a026d7ffd5451e: Status 404 returned error can't find the container with id baf34aac0faf17fa9d0e24790c6909afc89459d3d6c6a8d138a026d7ffd5451e Apr 16 08:37:19.873415 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:37:19.873370 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-vdxz4" 
event={"ID":"53e35ca8-ec77-48b7-8e96-ae73f7083c85","Type":"ContainerStarted","Data":"baf34aac0faf17fa9d0e24790c6909afc89459d3d6c6a8d138a026d7ffd5451e"} Apr 16 08:37:20.877478 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:37:20.877446 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-vdxz4" event={"ID":"53e35ca8-ec77-48b7-8e96-ae73f7083c85","Type":"ContainerStarted","Data":"c9396140f9952426a36ea7920f15a08dbb0e585f080b73e70b56111fdb37d166"} Apr 16 08:37:20.877478 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:37:20.877483 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-vdxz4" event={"ID":"53e35ca8-ec77-48b7-8e96-ae73f7083c85","Type":"ContainerStarted","Data":"3fe4e4089abbf85182273302616989adbcc6b5c468ac782cf70577cd3488ab24"} Apr 16 08:37:20.896459 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:37:20.896416 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-vdxz4" podStartSLOduration=253.003925106 podStartE2EDuration="4m13.896399639s" podCreationTimestamp="2026-04-16 08:33:07 +0000 UTC" firstStartedPulling="2026-04-16 08:37:19.224706859 +0000 UTC m=+252.631745800" lastFinishedPulling="2026-04-16 08:37:20.11718139 +0000 UTC m=+253.524220333" observedRunningTime="2026-04-16 08:37:20.894674503 +0000 UTC m=+254.301713464" watchObservedRunningTime="2026-04-16 08:37:20.896399639 +0000 UTC m=+254.303438603" Apr 16 08:38:07.107223 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:38:07.107189 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-5dsf6_e931af40-20ab-4a49-9b92-41dc1be13602/console-operator/1.log" Apr 16 08:38:07.107778 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:38:07.107532 2579 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-5dsf6_e931af40-20ab-4a49-9b92-41dc1be13602/console-operator/1.log" Apr 16 08:38:07.114043 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:38:07.114019 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-m577d_09a74c48-2de1-498a-893b-0fa1b8dbd0dd/ovn-acl-logging/0.log" Apr 16 08:38:07.114568 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:38:07.114549 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-m577d_09a74c48-2de1-498a-893b-0fa1b8dbd0dd/ovn-acl-logging/0.log" Apr 16 08:38:07.117432 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:38:07.117415 2579 kubelet.go:1628] "Image garbage collection succeeded" Apr 16 08:38:46.680598 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:38:46.680571 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-7f84b8fc8c-nvks2"] Apr 16 08:38:46.683467 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:38:46.683452 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7f84b8fc8c-nvks2" Apr 16 08:38:46.686291 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:38:46.686264 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 16 08:38:46.686291 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:38:46.686285 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 16 08:38:46.687223 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:38:46.687204 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 16 08:38:46.687347 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:38:46.687204 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\"" Apr 16 08:38:46.692308 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:38:46.692287 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-7f84b8fc8c-nvks2"] Apr 16 08:38:46.790130 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:38:46.790104 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/0e96503c-0c6c-4ae6-9f93-c32e5f346d60-klusterlet-config\") pod \"klusterlet-addon-workmgr-7f84b8fc8c-nvks2\" (UID: \"0e96503c-0c6c-4ae6-9f93-c32e5f346d60\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7f84b8fc8c-nvks2" Apr 16 08:38:46.790286 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:38:46.790137 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/0e96503c-0c6c-4ae6-9f93-c32e5f346d60-tmp\") pod \"klusterlet-addon-workmgr-7f84b8fc8c-nvks2\" (UID: \"0e96503c-0c6c-4ae6-9f93-c32e5f346d60\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7f84b8fc8c-nvks2" Apr 16 08:38:46.790286 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:38:46.790158 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xckh\" (UniqueName: \"kubernetes.io/projected/0e96503c-0c6c-4ae6-9f93-c32e5f346d60-kube-api-access-5xckh\") pod \"klusterlet-addon-workmgr-7f84b8fc8c-nvks2\" (UID: \"0e96503c-0c6c-4ae6-9f93-c32e5f346d60\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7f84b8fc8c-nvks2" Apr 16 08:38:46.890853 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:38:46.890822 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/0e96503c-0c6c-4ae6-9f93-c32e5f346d60-klusterlet-config\") pod \"klusterlet-addon-workmgr-7f84b8fc8c-nvks2\" (UID: \"0e96503c-0c6c-4ae6-9f93-c32e5f346d60\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7f84b8fc8c-nvks2" Apr 16 08:38:46.890853 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:38:46.890858 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0e96503c-0c6c-4ae6-9f93-c32e5f346d60-tmp\") pod \"klusterlet-addon-workmgr-7f84b8fc8c-nvks2\" (UID: \"0e96503c-0c6c-4ae6-9f93-c32e5f346d60\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7f84b8fc8c-nvks2" Apr 16 08:38:46.891112 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:38:46.890882 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5xckh\" (UniqueName: \"kubernetes.io/projected/0e96503c-0c6c-4ae6-9f93-c32e5f346d60-kube-api-access-5xckh\") pod \"klusterlet-addon-workmgr-7f84b8fc8c-nvks2\" (UID: 
\"0e96503c-0c6c-4ae6-9f93-c32e5f346d60\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7f84b8fc8c-nvks2" Apr 16 08:38:46.891246 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:38:46.891225 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0e96503c-0c6c-4ae6-9f93-c32e5f346d60-tmp\") pod \"klusterlet-addon-workmgr-7f84b8fc8c-nvks2\" (UID: \"0e96503c-0c6c-4ae6-9f93-c32e5f346d60\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7f84b8fc8c-nvks2" Apr 16 08:38:46.893710 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:38:46.893682 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/0e96503c-0c6c-4ae6-9f93-c32e5f346d60-klusterlet-config\") pod \"klusterlet-addon-workmgr-7f84b8fc8c-nvks2\" (UID: \"0e96503c-0c6c-4ae6-9f93-c32e5f346d60\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7f84b8fc8c-nvks2" Apr 16 08:38:46.900696 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:38:46.900676 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xckh\" (UniqueName: \"kubernetes.io/projected/0e96503c-0c6c-4ae6-9f93-c32e5f346d60-kube-api-access-5xckh\") pod \"klusterlet-addon-workmgr-7f84b8fc8c-nvks2\" (UID: \"0e96503c-0c6c-4ae6-9f93-c32e5f346d60\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7f84b8fc8c-nvks2" Apr 16 08:38:46.993049 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:38:46.992944 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7f84b8fc8c-nvks2" Apr 16 08:38:47.111168 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:38:47.111137 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-7f84b8fc8c-nvks2"] Apr 16 08:38:47.113997 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:38:47.113970 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0e96503c_0c6c_4ae6_9f93_c32e5f346d60.slice/crio-3f260328f0faeb8f07715386723360845ff67009f4a34a47c66660fc1afa73b3 WatchSource:0}: Error finding container 3f260328f0faeb8f07715386723360845ff67009f4a34a47c66660fc1afa73b3: Status 404 returned error can't find the container with id 3f260328f0faeb8f07715386723360845ff67009f4a34a47c66660fc1afa73b3 Apr 16 08:38:47.115448 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:38:47.115433 2579 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 08:38:48.108824 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:38:48.108764 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7f84b8fc8c-nvks2" event={"ID":"0e96503c-0c6c-4ae6-9f93-c32e5f346d60","Type":"ContainerStarted","Data":"3f260328f0faeb8f07715386723360845ff67009f4a34a47c66660fc1afa73b3"} Apr 16 08:38:51.118548 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:38:51.118511 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7f84b8fc8c-nvks2" event={"ID":"0e96503c-0c6c-4ae6-9f93-c32e5f346d60","Type":"ContainerStarted","Data":"a61e0986cfe5c3fa522cf10ecd3087ef6df58da17b4fab90318f076cbb2468ca"} Apr 16 08:38:51.119010 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:38:51.118879 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7f84b8fc8c-nvks2" Apr 16 08:38:51.120257 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:38:51.120237 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7f84b8fc8c-nvks2" Apr 16 08:38:51.136359 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:38:51.136317 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7f84b8fc8c-nvks2" podStartSLOduration=1.758517993 podStartE2EDuration="5.136303849s" podCreationTimestamp="2026-04-16 08:38:46 +0000 UTC" firstStartedPulling="2026-04-16 08:38:47.11562992 +0000 UTC m=+340.522668863" lastFinishedPulling="2026-04-16 08:38:50.493415776 +0000 UTC m=+343.900454719" observedRunningTime="2026-04-16 08:38:51.135048857 +0000 UTC m=+344.542087820" watchObservedRunningTime="2026-04-16 08:38:51.136303849 +0000 UTC m=+344.543342811" Apr 16 08:39:15.085408 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:39:15.085369 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e57x8l7"] Apr 16 08:39:15.089022 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:39:15.089001 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e57x8l7" Apr 16 08:39:15.094233 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:39:15.094212 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 16 08:39:15.094345 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:39:15.094269 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 16 08:39:15.094345 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:39:15.094270 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-7rdkl\"" Apr 16 08:39:15.098479 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:39:15.098459 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e57x8l7"] Apr 16 08:39:15.210023 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:39:15.209994 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z48sz\" (UniqueName: \"kubernetes.io/projected/c19c135f-84a0-4d06-b7de-b70656e7f7f8-kube-api-access-z48sz\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e57x8l7\" (UID: \"c19c135f-84a0-4d06-b7de-b70656e7f7f8\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e57x8l7" Apr 16 08:39:15.210183 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:39:15.210038 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c19c135f-84a0-4d06-b7de-b70656e7f7f8-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e57x8l7\" (UID: \"c19c135f-84a0-4d06-b7de-b70656e7f7f8\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e57x8l7" 
Apr 16 08:39:15.210183 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:39:15.210072 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c19c135f-84a0-4d06-b7de-b70656e7f7f8-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e57x8l7\" (UID: \"c19c135f-84a0-4d06-b7de-b70656e7f7f8\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e57x8l7" Apr 16 08:39:15.311303 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:39:15.311273 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c19c135f-84a0-4d06-b7de-b70656e7f7f8-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e57x8l7\" (UID: \"c19c135f-84a0-4d06-b7de-b70656e7f7f8\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e57x8l7" Apr 16 08:39:15.311499 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:39:15.311333 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z48sz\" (UniqueName: \"kubernetes.io/projected/c19c135f-84a0-4d06-b7de-b70656e7f7f8-kube-api-access-z48sz\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e57x8l7\" (UID: \"c19c135f-84a0-4d06-b7de-b70656e7f7f8\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e57x8l7" Apr 16 08:39:15.311499 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:39:15.311362 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c19c135f-84a0-4d06-b7de-b70656e7f7f8-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e57x8l7\" (UID: \"c19c135f-84a0-4d06-b7de-b70656e7f7f8\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e57x8l7" Apr 16 08:39:15.311681 ip-10-0-130-41 kubenswrapper[2579]: 
I0416 08:39:15.311667 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c19c135f-84a0-4d06-b7de-b70656e7f7f8-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e57x8l7\" (UID: \"c19c135f-84a0-4d06-b7de-b70656e7f7f8\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e57x8l7" Apr 16 08:39:15.311716 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:39:15.311678 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c19c135f-84a0-4d06-b7de-b70656e7f7f8-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e57x8l7\" (UID: \"c19c135f-84a0-4d06-b7de-b70656e7f7f8\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e57x8l7" Apr 16 08:39:15.321346 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:39:15.321324 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z48sz\" (UniqueName: \"kubernetes.io/projected/c19c135f-84a0-4d06-b7de-b70656e7f7f8-kube-api-access-z48sz\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e57x8l7\" (UID: \"c19c135f-84a0-4d06-b7de-b70656e7f7f8\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e57x8l7" Apr 16 08:39:15.403234 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:39:15.403167 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e57x8l7" Apr 16 08:39:15.523924 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:39:15.523877 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e57x8l7"] Apr 16 08:39:15.527193 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:39:15.527166 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc19c135f_84a0_4d06_b7de_b70656e7f7f8.slice/crio-b5bc3d337a5354896b8839809dbebf180cafbd64d61d875b527ddd8f32db68ae WatchSource:0}: Error finding container b5bc3d337a5354896b8839809dbebf180cafbd64d61d875b527ddd8f32db68ae: Status 404 returned error can't find the container with id b5bc3d337a5354896b8839809dbebf180cafbd64d61d875b527ddd8f32db68ae Apr 16 08:39:16.184392 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:39:16.184351 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e57x8l7" event={"ID":"c19c135f-84a0-4d06-b7de-b70656e7f7f8","Type":"ContainerStarted","Data":"b5bc3d337a5354896b8839809dbebf180cafbd64d61d875b527ddd8f32db68ae"} Apr 16 08:39:21.200848 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:39:21.200817 2579 generic.go:358] "Generic (PLEG): container finished" podID="c19c135f-84a0-4d06-b7de-b70656e7f7f8" containerID="2b5cc9fb02ac852fd89ce1b1397198738f5d3e91cef41abdc9ecb30cb3f19857" exitCode=0 Apr 16 08:39:21.201264 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:39:21.200930 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e57x8l7" event={"ID":"c19c135f-84a0-4d06-b7de-b70656e7f7f8","Type":"ContainerDied","Data":"2b5cc9fb02ac852fd89ce1b1397198738f5d3e91cef41abdc9ecb30cb3f19857"} Apr 16 08:39:25.211949 ip-10-0-130-41 kubenswrapper[2579]: I0416 
08:39:25.211912 2579 generic.go:358] "Generic (PLEG): container finished" podID="c19c135f-84a0-4d06-b7de-b70656e7f7f8" containerID="5ba0093c2468e9f0a56aa56694fbb864e3a6d0c90838a91763e164eb64221f82" exitCode=0
Apr 16 08:39:25.212333 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:39:25.211970 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e57x8l7" event={"ID":"c19c135f-84a0-4d06-b7de-b70656e7f7f8","Type":"ContainerDied","Data":"5ba0093c2468e9f0a56aa56694fbb864e3a6d0c90838a91763e164eb64221f82"}
Apr 16 08:39:31.229514 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:39:31.229423 2579 generic.go:358] "Generic (PLEG): container finished" podID="c19c135f-84a0-4d06-b7de-b70656e7f7f8" containerID="94ce0a9e0c78312d2ba50ad2effa80e6e7f50ca5d1a06633bb4b42b20ba709d5" exitCode=0
Apr 16 08:39:31.229514 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:39:31.229459 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e57x8l7" event={"ID":"c19c135f-84a0-4d06-b7de-b70656e7f7f8","Type":"ContainerDied","Data":"94ce0a9e0c78312d2ba50ad2effa80e6e7f50ca5d1a06633bb4b42b20ba709d5"}
Apr 16 08:39:32.352651 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:39:32.352630 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e57x8l7"
Apr 16 08:39:32.446731 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:39:32.446697 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z48sz\" (UniqueName: \"kubernetes.io/projected/c19c135f-84a0-4d06-b7de-b70656e7f7f8-kube-api-access-z48sz\") pod \"c19c135f-84a0-4d06-b7de-b70656e7f7f8\" (UID: \"c19c135f-84a0-4d06-b7de-b70656e7f7f8\") "
Apr 16 08:39:32.446925 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:39:32.446756 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c19c135f-84a0-4d06-b7de-b70656e7f7f8-bundle\") pod \"c19c135f-84a0-4d06-b7de-b70656e7f7f8\" (UID: \"c19c135f-84a0-4d06-b7de-b70656e7f7f8\") "
Apr 16 08:39:32.446925 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:39:32.446795 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c19c135f-84a0-4d06-b7de-b70656e7f7f8-util\") pod \"c19c135f-84a0-4d06-b7de-b70656e7f7f8\" (UID: \"c19c135f-84a0-4d06-b7de-b70656e7f7f8\") "
Apr 16 08:39:32.447368 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:39:32.447301 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c19c135f-84a0-4d06-b7de-b70656e7f7f8-bundle" (OuterVolumeSpecName: "bundle") pod "c19c135f-84a0-4d06-b7de-b70656e7f7f8" (UID: "c19c135f-84a0-4d06-b7de-b70656e7f7f8"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 08:39:32.448998 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:39:32.448973 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c19c135f-84a0-4d06-b7de-b70656e7f7f8-kube-api-access-z48sz" (OuterVolumeSpecName: "kube-api-access-z48sz") pod "c19c135f-84a0-4d06-b7de-b70656e7f7f8" (UID: "c19c135f-84a0-4d06-b7de-b70656e7f7f8"). InnerVolumeSpecName "kube-api-access-z48sz". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 08:39:32.450802 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:39:32.450776 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c19c135f-84a0-4d06-b7de-b70656e7f7f8-util" (OuterVolumeSpecName: "util") pod "c19c135f-84a0-4d06-b7de-b70656e7f7f8" (UID: "c19c135f-84a0-4d06-b7de-b70656e7f7f8"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 08:39:32.547672 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:39:32.547593 2579 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c19c135f-84a0-4d06-b7de-b70656e7f7f8-util\") on node \"ip-10-0-130-41.ec2.internal\" DevicePath \"\""
Apr 16 08:39:32.547672 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:39:32.547622 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-z48sz\" (UniqueName: \"kubernetes.io/projected/c19c135f-84a0-4d06-b7de-b70656e7f7f8-kube-api-access-z48sz\") on node \"ip-10-0-130-41.ec2.internal\" DevicePath \"\""
Apr 16 08:39:32.547672 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:39:32.547631 2579 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c19c135f-84a0-4d06-b7de-b70656e7f7f8-bundle\") on node \"ip-10-0-130-41.ec2.internal\" DevicePath \"\""
Apr 16 08:39:33.236056 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:39:33.236019 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e57x8l7" event={"ID":"c19c135f-84a0-4d06-b7de-b70656e7f7f8","Type":"ContainerDied","Data":"b5bc3d337a5354896b8839809dbebf180cafbd64d61d875b527ddd8f32db68ae"}
Apr 16 08:39:33.236056 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:39:33.236058 2579 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b5bc3d337a5354896b8839809dbebf180cafbd64d61d875b527ddd8f32db68ae"
Apr 16 08:39:33.236234 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:39:33.236068 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e57x8l7"
Apr 16 08:39:37.466070 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:39:37.466035 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-975fm"]
Apr 16 08:39:37.466407 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:39:37.466290 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c19c135f-84a0-4d06-b7de-b70656e7f7f8" containerName="extract"
Apr 16 08:39:37.466407 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:39:37.466301 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="c19c135f-84a0-4d06-b7de-b70656e7f7f8" containerName="extract"
Apr 16 08:39:37.466407 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:39:37.466313 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c19c135f-84a0-4d06-b7de-b70656e7f7f8" containerName="pull"
Apr 16 08:39:37.466407 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:39:37.466319 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="c19c135f-84a0-4d06-b7de-b70656e7f7f8" containerName="pull"
Apr 16 08:39:37.466407 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:39:37.466332 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c19c135f-84a0-4d06-b7de-b70656e7f7f8" containerName="util"
Apr 16 08:39:37.466407 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:39:37.466338 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="c19c135f-84a0-4d06-b7de-b70656e7f7f8" containerName="util"
Apr 16 08:39:37.466407 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:39:37.466383 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="c19c135f-84a0-4d06-b7de-b70656e7f7f8" containerName="extract"
Apr 16 08:39:37.469498 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:39:37.469482 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-975fm"
Apr 16 08:39:37.472095 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:39:37.472068 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"openshift-service-ca.crt\""
Apr 16 08:39:37.472208 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:39:37.472133 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager-operator\"/\"cert-manager-operator-controller-manager-dockercfg-wqxd7\""
Apr 16 08:39:37.472277 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:39:37.472227 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"kube-root-ca.crt\""
Apr 16 08:39:37.481663 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:39:37.481641 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-975fm"]
Apr 16 08:39:37.586683 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:39:37.586654 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/7007a6f4-3787-4c1d-a46d-ce26f0ca9b3b-tmp\") pod \"cert-manager-operator-controller-manager-7c5b8bd68-975fm\" (UID: \"7007a6f4-3787-4c1d-a46d-ce26f0ca9b3b\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-975fm"
Apr 16 08:39:37.586802 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:39:37.586708 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rzj4\" (UniqueName: \"kubernetes.io/projected/7007a6f4-3787-4c1d-a46d-ce26f0ca9b3b-kube-api-access-5rzj4\") pod \"cert-manager-operator-controller-manager-7c5b8bd68-975fm\" (UID: \"7007a6f4-3787-4c1d-a46d-ce26f0ca9b3b\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-975fm"
Apr 16 08:39:37.688048 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:39:37.688000 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5rzj4\" (UniqueName: \"kubernetes.io/projected/7007a6f4-3787-4c1d-a46d-ce26f0ca9b3b-kube-api-access-5rzj4\") pod \"cert-manager-operator-controller-manager-7c5b8bd68-975fm\" (UID: \"7007a6f4-3787-4c1d-a46d-ce26f0ca9b3b\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-975fm"
Apr 16 08:39:37.688203 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:39:37.688065 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/7007a6f4-3787-4c1d-a46d-ce26f0ca9b3b-tmp\") pod \"cert-manager-operator-controller-manager-7c5b8bd68-975fm\" (UID: \"7007a6f4-3787-4c1d-a46d-ce26f0ca9b3b\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-975fm"
Apr 16 08:39:37.688397 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:39:37.688380 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/7007a6f4-3787-4c1d-a46d-ce26f0ca9b3b-tmp\") pod \"cert-manager-operator-controller-manager-7c5b8bd68-975fm\" (UID: \"7007a6f4-3787-4c1d-a46d-ce26f0ca9b3b\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-975fm"
Apr 16 08:39:37.698218 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:39:37.698196 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rzj4\" (UniqueName: \"kubernetes.io/projected/7007a6f4-3787-4c1d-a46d-ce26f0ca9b3b-kube-api-access-5rzj4\") pod \"cert-manager-operator-controller-manager-7c5b8bd68-975fm\" (UID: \"7007a6f4-3787-4c1d-a46d-ce26f0ca9b3b\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-975fm"
Apr 16 08:39:37.778732 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:39:37.778663 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-975fm"
Apr 16 08:39:37.902682 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:39:37.902628 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-975fm"]
Apr 16 08:39:37.904838 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:39:37.904801 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7007a6f4_3787_4c1d_a46d_ce26f0ca9b3b.slice/crio-df0e79c218a04946079578426be48e8e79d84b945e1c631da44c0dd1c441d0b9 WatchSource:0}: Error finding container df0e79c218a04946079578426be48e8e79d84b945e1c631da44c0dd1c441d0b9: Status 404 returned error can't find the container with id df0e79c218a04946079578426be48e8e79d84b945e1c631da44c0dd1c441d0b9
Apr 16 08:39:38.250035 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:39:38.249996 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-975fm" event={"ID":"7007a6f4-3787-4c1d-a46d-ce26f0ca9b3b","Type":"ContainerStarted","Data":"df0e79c218a04946079578426be48e8e79d84b945e1c631da44c0dd1c441d0b9"}
Apr 16 08:39:40.256835 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:39:40.256807 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-975fm" event={"ID":"7007a6f4-3787-4c1d-a46d-ce26f0ca9b3b","Type":"ContainerStarted","Data":"e86ebbd7d3a4059873ec849b094e549848c61948224ca7dd716934e960eb0e29"}
Apr 16 08:39:40.280867 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:39:40.280683 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-975fm" podStartSLOduration=1.000314643 podStartE2EDuration="3.280665101s" podCreationTimestamp="2026-04-16 08:39:37 +0000 UTC" firstStartedPulling="2026-04-16 08:39:37.907296656 +0000 UTC m=+391.314335615" lastFinishedPulling="2026-04-16 08:39:40.187647133 +0000 UTC m=+393.594686073" observedRunningTime="2026-04-16 08:39:40.279042337 +0000 UTC m=+393.686081422" watchObservedRunningTime="2026-04-16 08:39:40.280665101 +0000 UTC m=+393.687704064"
Apr 16 08:39:46.683843 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:39:46.683810 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-crg76"]
Apr 16 08:39:46.687155 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:39:46.687139 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-597b96b99b-crg76"
Apr 16 08:39:46.689809 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:39:46.689786 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\""
Apr 16 08:39:46.690876 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:39:46.690860 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\""
Apr 16 08:39:46.690946 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:39:46.690914 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-webhook-dockercfg-x5hsq\""
Apr 16 08:39:46.695269 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:39:46.695249 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-crg76"]
Apr 16 08:39:46.750390 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:39:46.750369 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bssq4\" (UniqueName: \"kubernetes.io/projected/1b63c059-e0cf-4fe2-8800-7da01c96ce33-kube-api-access-bssq4\") pod \"cert-manager-webhook-597b96b99b-crg76\" (UID: \"1b63c059-e0cf-4fe2-8800-7da01c96ce33\") " pod="cert-manager/cert-manager-webhook-597b96b99b-crg76"
Apr 16 08:39:46.750484 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:39:46.750395 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1b63c059-e0cf-4fe2-8800-7da01c96ce33-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-crg76\" (UID: \"1b63c059-e0cf-4fe2-8800-7da01c96ce33\") " pod="cert-manager/cert-manager-webhook-597b96b99b-crg76"
Apr 16 08:39:46.851541 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:39:46.851515 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bssq4\" (UniqueName: \"kubernetes.io/projected/1b63c059-e0cf-4fe2-8800-7da01c96ce33-kube-api-access-bssq4\") pod \"cert-manager-webhook-597b96b99b-crg76\" (UID: \"1b63c059-e0cf-4fe2-8800-7da01c96ce33\") " pod="cert-manager/cert-manager-webhook-597b96b99b-crg76"
Apr 16 08:39:46.851661 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:39:46.851546 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1b63c059-e0cf-4fe2-8800-7da01c96ce33-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-crg76\" (UID: \"1b63c059-e0cf-4fe2-8800-7da01c96ce33\") " pod="cert-manager/cert-manager-webhook-597b96b99b-crg76"
Apr 16 08:39:46.860854 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:39:46.860825 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1b63c059-e0cf-4fe2-8800-7da01c96ce33-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-crg76\" (UID: \"1b63c059-e0cf-4fe2-8800-7da01c96ce33\") " pod="cert-manager/cert-manager-webhook-597b96b99b-crg76"
Apr 16 08:39:46.860981 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:39:46.860880 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bssq4\" (UniqueName: \"kubernetes.io/projected/1b63c059-e0cf-4fe2-8800-7da01c96ce33-kube-api-access-bssq4\") pod \"cert-manager-webhook-597b96b99b-crg76\" (UID: \"1b63c059-e0cf-4fe2-8800-7da01c96ce33\") " pod="cert-manager/cert-manager-webhook-597b96b99b-crg76"
Apr 16 08:39:47.012132 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:39:47.012068 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-597b96b99b-crg76"
Apr 16 08:39:47.133177 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:39:47.133145 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-crg76"]
Apr 16 08:39:47.136152 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:39:47.136128 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1b63c059_e0cf_4fe2_8800_7da01c96ce33.slice/crio-cd5d28ee80706c0a1550531e774f939ae9206e0a9e6f43c1cf6a41a78c595e88 WatchSource:0}: Error finding container cd5d28ee80706c0a1550531e774f939ae9206e0a9e6f43c1cf6a41a78c595e88: Status 404 returned error can't find the container with id cd5d28ee80706c0a1550531e774f939ae9206e0a9e6f43c1cf6a41a78c595e88
Apr 16 08:39:47.276346 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:39:47.276272 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-597b96b99b-crg76" event={"ID":"1b63c059-e0cf-4fe2-8800-7da01c96ce33","Type":"ContainerStarted","Data":"cd5d28ee80706c0a1550531e774f939ae9206e0a9e6f43c1cf6a41a78c595e88"}
Apr 16 08:39:50.286931 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:39:50.286833 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-597b96b99b-crg76" event={"ID":"1b63c059-e0cf-4fe2-8800-7da01c96ce33","Type":"ContainerStarted","Data":"d1cfc0249c31b45d782b6743b79189de06348d4d127dbde221b576ac437a48f3"}
Apr 16 08:39:50.286931 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:39:50.286916 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="cert-manager/cert-manager-webhook-597b96b99b-crg76"
Apr 16 08:39:50.307315 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:39:50.307271 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-597b96b99b-crg76" podStartSLOduration=1.525221599 podStartE2EDuration="4.307257984s" podCreationTimestamp="2026-04-16 08:39:46 +0000 UTC" firstStartedPulling="2026-04-16 08:39:47.138057803 +0000 UTC m=+400.545096751" lastFinishedPulling="2026-04-16 08:39:49.920094192 +0000 UTC m=+403.327133136" observedRunningTime="2026-04-16 08:39:50.305588384 +0000 UTC m=+403.712627348" watchObservedRunningTime="2026-04-16 08:39:50.307257984 +0000 UTC m=+403.714296946"
Apr 16 08:39:56.291523 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:39:56.291488 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-597b96b99b-crg76"
Apr 16 08:40:05.056861 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:40:05.056825 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e8z5fg"]
Apr 16 08:40:05.064383 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:40:05.064362 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e8z5fg"
Apr 16 08:40:05.067165 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:40:05.067142 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 16 08:40:05.067971 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:40:05.067937 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-7rdkl\""
Apr 16 08:40:05.068086 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:40:05.067944 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 16 08:40:05.070429 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:40:05.070403 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e8z5fg"]
Apr 16 08:40:05.189983 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:40:05.189950 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8xhd\" (UniqueName: \"kubernetes.io/projected/6799881c-e6e8-44f4-9ab6-5fc0cd5ae425-kube-api-access-v8xhd\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e8z5fg\" (UID: \"6799881c-e6e8-44f4-9ab6-5fc0cd5ae425\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e8z5fg"
Apr 16 08:40:05.190140 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:40:05.190010 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6799881c-e6e8-44f4-9ab6-5fc0cd5ae425-bundle\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e8z5fg\" (UID: \"6799881c-e6e8-44f4-9ab6-5fc0cd5ae425\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e8z5fg"
Apr 16 08:40:05.190140 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:40:05.190055 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6799881c-e6e8-44f4-9ab6-5fc0cd5ae425-util\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e8z5fg\" (UID: \"6799881c-e6e8-44f4-9ab6-5fc0cd5ae425\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e8z5fg"
Apr 16 08:40:05.290947 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:40:05.290912 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v8xhd\" (UniqueName: \"kubernetes.io/projected/6799881c-e6e8-44f4-9ab6-5fc0cd5ae425-kube-api-access-v8xhd\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e8z5fg\" (UID: \"6799881c-e6e8-44f4-9ab6-5fc0cd5ae425\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e8z5fg"
Apr 16 08:40:05.291117 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:40:05.291062 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6799881c-e6e8-44f4-9ab6-5fc0cd5ae425-bundle\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e8z5fg\" (UID: \"6799881c-e6e8-44f4-9ab6-5fc0cd5ae425\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e8z5fg"
Apr 16 08:40:05.291117 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:40:05.291089 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6799881c-e6e8-44f4-9ab6-5fc0cd5ae425-util\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e8z5fg\" (UID: \"6799881c-e6e8-44f4-9ab6-5fc0cd5ae425\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e8z5fg"
Apr 16 08:40:05.291570 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:40:05.291546 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6799881c-e6e8-44f4-9ab6-5fc0cd5ae425-bundle\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e8z5fg\" (UID: \"6799881c-e6e8-44f4-9ab6-5fc0cd5ae425\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e8z5fg"
Apr 16 08:40:05.291626 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:40:05.291568 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6799881c-e6e8-44f4-9ab6-5fc0cd5ae425-util\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e8z5fg\" (UID: \"6799881c-e6e8-44f4-9ab6-5fc0cd5ae425\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e8z5fg"
Apr 16 08:40:05.301145 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:40:05.301117 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8xhd\" (UniqueName: \"kubernetes.io/projected/6799881c-e6e8-44f4-9ab6-5fc0cd5ae425-kube-api-access-v8xhd\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e8z5fg\" (UID: \"6799881c-e6e8-44f4-9ab6-5fc0cd5ae425\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e8z5fg"
Apr 16 08:40:05.374433 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:40:05.374408 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e8z5fg"
Apr 16 08:40:05.496071 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:40:05.496032 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e8z5fg"]
Apr 16 08:40:05.499008 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:40:05.498976 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6799881c_e6e8_44f4_9ab6_5fc0cd5ae425.slice/crio-79ca7a233454c2681451741f001074b082a581fa96f6ccf6a01bf754f6fd8b6d WatchSource:0}: Error finding container 79ca7a233454c2681451741f001074b082a581fa96f6ccf6a01bf754f6fd8b6d: Status 404 returned error can't find the container with id 79ca7a233454c2681451741f001074b082a581fa96f6ccf6a01bf754f6fd8b6d
Apr 16 08:40:06.337198 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:40:06.337161 2579 generic.go:358] "Generic (PLEG): container finished" podID="6799881c-e6e8-44f4-9ab6-5fc0cd5ae425" containerID="d5f7fa7cbf0ad33d344d69e8597e1406541b2f63bda4aa33da3d22b1ac153ff1" exitCode=0
Apr 16 08:40:06.337521 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:40:06.337243 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e8z5fg" event={"ID":"6799881c-e6e8-44f4-9ab6-5fc0cd5ae425","Type":"ContainerDied","Data":"d5f7fa7cbf0ad33d344d69e8597e1406541b2f63bda4aa33da3d22b1ac153ff1"}
Apr 16 08:40:06.337521 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:40:06.337275 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e8z5fg" event={"ID":"6799881c-e6e8-44f4-9ab6-5fc0cd5ae425","Type":"ContainerStarted","Data":"79ca7a233454c2681451741f001074b082a581fa96f6ccf6a01bf754f6fd8b6d"}
Apr 16 08:40:09.347614 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:40:09.347581 2579 generic.go:358] "Generic (PLEG): container finished" podID="6799881c-e6e8-44f4-9ab6-5fc0cd5ae425" containerID="d1ea969bc24a5f97b6702b8027876edd4331c58d4b9089776de843c2d67cba7f" exitCode=0
Apr 16 08:40:09.348012 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:40:09.347682 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e8z5fg" event={"ID":"6799881c-e6e8-44f4-9ab6-5fc0cd5ae425","Type":"ContainerDied","Data":"d1ea969bc24a5f97b6702b8027876edd4331c58d4b9089776de843c2d67cba7f"}
Apr 16 08:40:10.355318 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:40:10.355284 2579 generic.go:358] "Generic (PLEG): container finished" podID="6799881c-e6e8-44f4-9ab6-5fc0cd5ae425" containerID="743f8bf4e47673db5d94ecaac29e30c3b666b16ecec0b13530ee469e124c2fd0" exitCode=0
Apr 16 08:40:10.355775 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:40:10.355354 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e8z5fg" event={"ID":"6799881c-e6e8-44f4-9ab6-5fc0cd5ae425","Type":"ContainerDied","Data":"743f8bf4e47673db5d94ecaac29e30c3b666b16ecec0b13530ee469e124c2fd0"}
Apr 16 08:40:11.473401 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:40:11.473379 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e8z5fg"
Apr 16 08:40:11.537646 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:40:11.537566 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6799881c-e6e8-44f4-9ab6-5fc0cd5ae425-util\") pod \"6799881c-e6e8-44f4-9ab6-5fc0cd5ae425\" (UID: \"6799881c-e6e8-44f4-9ab6-5fc0cd5ae425\") "
Apr 16 08:40:11.537771 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:40:11.537710 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6799881c-e6e8-44f4-9ab6-5fc0cd5ae425-bundle\") pod \"6799881c-e6e8-44f4-9ab6-5fc0cd5ae425\" (UID: \"6799881c-e6e8-44f4-9ab6-5fc0cd5ae425\") "
Apr 16 08:40:11.537771 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:40:11.537749 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v8xhd\" (UniqueName: \"kubernetes.io/projected/6799881c-e6e8-44f4-9ab6-5fc0cd5ae425-kube-api-access-v8xhd\") pod \"6799881c-e6e8-44f4-9ab6-5fc0cd5ae425\" (UID: \"6799881c-e6e8-44f4-9ab6-5fc0cd5ae425\") "
Apr 16 08:40:11.538102 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:40:11.538075 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6799881c-e6e8-44f4-9ab6-5fc0cd5ae425-bundle" (OuterVolumeSpecName: "bundle") pod "6799881c-e6e8-44f4-9ab6-5fc0cd5ae425" (UID: "6799881c-e6e8-44f4-9ab6-5fc0cd5ae425"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 08:40:11.539979 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:40:11.539950 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6799881c-e6e8-44f4-9ab6-5fc0cd5ae425-kube-api-access-v8xhd" (OuterVolumeSpecName: "kube-api-access-v8xhd") pod "6799881c-e6e8-44f4-9ab6-5fc0cd5ae425" (UID: "6799881c-e6e8-44f4-9ab6-5fc0cd5ae425"). InnerVolumeSpecName "kube-api-access-v8xhd". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 08:40:11.544119 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:40:11.544096 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6799881c-e6e8-44f4-9ab6-5fc0cd5ae425-util" (OuterVolumeSpecName: "util") pod "6799881c-e6e8-44f4-9ab6-5fc0cd5ae425" (UID: "6799881c-e6e8-44f4-9ab6-5fc0cd5ae425"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 08:40:11.638557 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:40:11.638492 2579 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6799881c-e6e8-44f4-9ab6-5fc0cd5ae425-bundle\") on node \"ip-10-0-130-41.ec2.internal\" DevicePath \"\""
Apr 16 08:40:11.638557 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:40:11.638524 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-v8xhd\" (UniqueName: \"kubernetes.io/projected/6799881c-e6e8-44f4-9ab6-5fc0cd5ae425-kube-api-access-v8xhd\") on node \"ip-10-0-130-41.ec2.internal\" DevicePath \"\""
Apr 16 08:40:11.638557 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:40:11.638533 2579 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6799881c-e6e8-44f4-9ab6-5fc0cd5ae425-util\") on node \"ip-10-0-130-41.ec2.internal\" DevicePath \"\""
Apr 16 08:40:12.362735 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:40:12.362679 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e8z5fg" event={"ID":"6799881c-e6e8-44f4-9ab6-5fc0cd5ae425","Type":"ContainerDied","Data":"79ca7a233454c2681451741f001074b082a581fa96f6ccf6a01bf754f6fd8b6d"}
Apr 16 08:40:12.362735 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:40:12.362733 2579 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="79ca7a233454c2681451741f001074b082a581fa96f6ccf6a01bf754f6fd8b6d"
Apr 16 08:40:12.362735 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:40:12.362714 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78e8z5fg"
Apr 16 08:40:27.433909 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:40:27.433811 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-jobset-operator/jobset-controller-manager-7b86fbfbb-tmxnt"]
Apr 16 08:40:27.434366 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:40:27.434156 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6799881c-e6e8-44f4-9ab6-5fc0cd5ae425" containerName="extract"
Apr 16 08:40:27.434366 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:40:27.434171 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="6799881c-e6e8-44f4-9ab6-5fc0cd5ae425" containerName="extract"
Apr 16 08:40:27.434366 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:40:27.434186 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6799881c-e6e8-44f4-9ab6-5fc0cd5ae425" containerName="util"
Apr 16 08:40:27.434366 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:40:27.434191 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="6799881c-e6e8-44f4-9ab6-5fc0cd5ae425" containerName="util"
Apr 16 08:40:27.434366 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:40:27.434198 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6799881c-e6e8-44f4-9ab6-5fc0cd5ae425" containerName="pull"
Apr 16 08:40:27.434366 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:40:27.434203 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="6799881c-e6e8-44f4-9ab6-5fc0cd5ae425" containerName="pull"
Apr 16 08:40:27.434366 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:40:27.434255 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="6799881c-e6e8-44f4-9ab6-5fc0cd5ae425" containerName="extract"
Apr 16 08:40:27.437805 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:40:27.437789 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-jobset-operator/jobset-controller-manager-7b86fbfbb-tmxnt"
Apr 16 08:40:27.440557 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:40:27.440531 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-jobset-operator\"/\"kube-root-ca.crt\""
Apr 16 08:40:27.441704 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:40:27.441682 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-jobset-operator\"/\"jobset-controller-manager-dockercfg-qgjlm\""
Apr 16 08:40:27.441815 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:40:27.441685 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-jobset-operator\"/\"openshift-service-ca.crt\""
Apr 16 08:40:27.441815 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:40:27.441731 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-jobset-operator\"/\"metrics-server-cert\""
Apr 16 08:40:27.441815 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:40:27.441745 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-jobset-operator\"/\"webhook-server-cert\""
Apr 16 08:40:27.441815 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:40:27.441730 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-jobset-operator\"/\"jobset-manager-config\""
Apr 16 08:40:27.445776 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:40:27.445758 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-jobset-operator/jobset-controller-manager-7b86fbfbb-tmxnt"]
Apr 16 08:40:27.556004 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:40:27.555969 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/98b63693-8bf4-4549-9763-81507572febd-metrics-certs\") pod \"jobset-controller-manager-7b86fbfbb-tmxnt\" (UID: \"98b63693-8bf4-4549-9763-81507572febd\") " pod="openshift-jobset-operator/jobset-controller-manager-7b86fbfbb-tmxnt"
Apr 16 08:40:27.556147 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:40:27.556010 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/98b63693-8bf4-4549-9763-81507572febd-cert\") pod \"jobset-controller-manager-7b86fbfbb-tmxnt\" (UID: \"98b63693-8bf4-4549-9763-81507572febd\") " pod="openshift-jobset-operator/jobset-controller-manager-7b86fbfbb-tmxnt"
Apr 16 08:40:27.556147 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:40:27.556087 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjrrg\" (UniqueName: \"kubernetes.io/projected/98b63693-8bf4-4549-9763-81507572febd-kube-api-access-sjrrg\") pod \"jobset-controller-manager-7b86fbfbb-tmxnt\" (UID: \"98b63693-8bf4-4549-9763-81507572febd\") " pod="openshift-jobset-operator/jobset-controller-manager-7b86fbfbb-tmxnt"
Apr 16 08:40:27.556147 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:40:27.556112 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/98b63693-8bf4-4549-9763-81507572febd-manager-config\") pod
\"jobset-controller-manager-7b86fbfbb-tmxnt\" (UID: \"98b63693-8bf4-4549-9763-81507572febd\") " pod="openshift-jobset-operator/jobset-controller-manager-7b86fbfbb-tmxnt" Apr 16 08:40:27.657210 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:40:27.657180 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sjrrg\" (UniqueName: \"kubernetes.io/projected/98b63693-8bf4-4549-9763-81507572febd-kube-api-access-sjrrg\") pod \"jobset-controller-manager-7b86fbfbb-tmxnt\" (UID: \"98b63693-8bf4-4549-9763-81507572febd\") " pod="openshift-jobset-operator/jobset-controller-manager-7b86fbfbb-tmxnt" Apr 16 08:40:27.657381 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:40:27.657219 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/98b63693-8bf4-4549-9763-81507572febd-manager-config\") pod \"jobset-controller-manager-7b86fbfbb-tmxnt\" (UID: \"98b63693-8bf4-4549-9763-81507572febd\") " pod="openshift-jobset-operator/jobset-controller-manager-7b86fbfbb-tmxnt" Apr 16 08:40:27.657381 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:40:27.657272 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/98b63693-8bf4-4549-9763-81507572febd-metrics-certs\") pod \"jobset-controller-manager-7b86fbfbb-tmxnt\" (UID: \"98b63693-8bf4-4549-9763-81507572febd\") " pod="openshift-jobset-operator/jobset-controller-manager-7b86fbfbb-tmxnt" Apr 16 08:40:27.657381 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:40:27.657305 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/98b63693-8bf4-4549-9763-81507572febd-cert\") pod \"jobset-controller-manager-7b86fbfbb-tmxnt\" (UID: \"98b63693-8bf4-4549-9763-81507572febd\") " pod="openshift-jobset-operator/jobset-controller-manager-7b86fbfbb-tmxnt" Apr 16 08:40:27.658003 ip-10-0-130-41 
kubenswrapper[2579]: I0416 08:40:27.657948 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/98b63693-8bf4-4549-9763-81507572febd-manager-config\") pod \"jobset-controller-manager-7b86fbfbb-tmxnt\" (UID: \"98b63693-8bf4-4549-9763-81507572febd\") " pod="openshift-jobset-operator/jobset-controller-manager-7b86fbfbb-tmxnt" Apr 16 08:40:27.660002 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:40:27.659978 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/98b63693-8bf4-4549-9763-81507572febd-cert\") pod \"jobset-controller-manager-7b86fbfbb-tmxnt\" (UID: \"98b63693-8bf4-4549-9763-81507572febd\") " pod="openshift-jobset-operator/jobset-controller-manager-7b86fbfbb-tmxnt" Apr 16 08:40:27.660102 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:40:27.660041 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/98b63693-8bf4-4549-9763-81507572febd-metrics-certs\") pod \"jobset-controller-manager-7b86fbfbb-tmxnt\" (UID: \"98b63693-8bf4-4549-9763-81507572febd\") " pod="openshift-jobset-operator/jobset-controller-manager-7b86fbfbb-tmxnt" Apr 16 08:40:27.666936 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:40:27.666884 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjrrg\" (UniqueName: \"kubernetes.io/projected/98b63693-8bf4-4549-9763-81507572febd-kube-api-access-sjrrg\") pod \"jobset-controller-manager-7b86fbfbb-tmxnt\" (UID: \"98b63693-8bf4-4549-9763-81507572febd\") " pod="openshift-jobset-operator/jobset-controller-manager-7b86fbfbb-tmxnt" Apr 16 08:40:27.747731 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:40:27.747638 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-jobset-operator/jobset-controller-manager-7b86fbfbb-tmxnt" Apr 16 08:40:27.868781 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:40:27.868755 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-jobset-operator/jobset-controller-manager-7b86fbfbb-tmxnt"] Apr 16 08:40:27.870946 ip-10-0-130-41 kubenswrapper[2579]: W0416 08:40:27.870921 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod98b63693_8bf4_4549_9763_81507572febd.slice/crio-d8209540e9235ddcfd769b5811b29b77ac99ee9b1f93e7d2d27d7b4ca9be3006 WatchSource:0}: Error finding container d8209540e9235ddcfd769b5811b29b77ac99ee9b1f93e7d2d27d7b4ca9be3006: Status 404 returned error can't find the container with id d8209540e9235ddcfd769b5811b29b77ac99ee9b1f93e7d2d27d7b4ca9be3006 Apr 16 08:40:28.411336 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:40:28.411301 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-jobset-operator/jobset-controller-manager-7b86fbfbb-tmxnt" event={"ID":"98b63693-8bf4-4549-9763-81507572febd","Type":"ContainerStarted","Data":"d8209540e9235ddcfd769b5811b29b77ac99ee9b1f93e7d2d27d7b4ca9be3006"} Apr 16 08:40:30.418543 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:40:30.418513 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-jobset-operator/jobset-controller-manager-7b86fbfbb-tmxnt" event={"ID":"98b63693-8bf4-4549-9763-81507572febd","Type":"ContainerStarted","Data":"03f4f9afc7a5a213d2587998b2a3a53c4536e2ea74d092438c9d2abc93a8c71a"} Apr 16 08:40:30.418960 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:40:30.418569 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-jobset-operator/jobset-controller-manager-7b86fbfbb-tmxnt" Apr 16 08:40:30.437806 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:40:30.437758 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-jobset-operator/jobset-controller-manager-7b86fbfbb-tmxnt" podStartSLOduration=1.8430718289999999 podStartE2EDuration="3.43774401s" podCreationTimestamp="2026-04-16 08:40:27 +0000 UTC" firstStartedPulling="2026-04-16 08:40:27.872769206 +0000 UTC m=+441.279808149" lastFinishedPulling="2026-04-16 08:40:29.467441388 +0000 UTC m=+442.874480330" observedRunningTime="2026-04-16 08:40:30.436747501 +0000 UTC m=+443.843786488" watchObservedRunningTime="2026-04-16 08:40:30.43774401 +0000 UTC m=+443.844782973" Apr 16 08:40:41.427182 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:40:41.427152 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-jobset-operator/jobset-controller-manager-7b86fbfbb-tmxnt" Apr 16 08:43:07.130864 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:43:07.130831 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-5dsf6_e931af40-20ab-4a49-9b92-41dc1be13602/console-operator/1.log" Apr 16 08:43:07.133647 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:43:07.133624 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-5dsf6_e931af40-20ab-4a49-9b92-41dc1be13602/console-operator/1.log" Apr 16 08:43:07.136182 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:43:07.136165 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-m577d_09a74c48-2de1-498a-893b-0fa1b8dbd0dd/ovn-acl-logging/0.log" Apr 16 08:43:07.138765 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:43:07.138749 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-m577d_09a74c48-2de1-498a-893b-0fa1b8dbd0dd/ovn-acl-logging/0.log" Apr 16 08:48:07.153425 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:48:07.153349 2579 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-5dsf6_e931af40-20ab-4a49-9b92-41dc1be13602/console-operator/1.log" Apr 16 08:48:07.154779 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:48:07.154757 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-5dsf6_e931af40-20ab-4a49-9b92-41dc1be13602/console-operator/1.log" Apr 16 08:48:07.159547 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:48:07.159519 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-m577d_09a74c48-2de1-498a-893b-0fa1b8dbd0dd/ovn-acl-logging/0.log" Apr 16 08:48:07.160312 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:48:07.160297 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-m577d_09a74c48-2de1-498a-893b-0fa1b8dbd0dd/ovn-acl-logging/0.log" Apr 16 08:53:07.177242 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:53:07.177210 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-5dsf6_e931af40-20ab-4a49-9b92-41dc1be13602/console-operator/1.log" Apr 16 08:53:07.178820 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:53:07.178055 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-5dsf6_e931af40-20ab-4a49-9b92-41dc1be13602/console-operator/1.log" Apr 16 08:53:07.183010 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:53:07.182994 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-m577d_09a74c48-2de1-498a-893b-0fa1b8dbd0dd/ovn-acl-logging/0.log" Apr 16 08:53:07.183629 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:53:07.183612 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-m577d_09a74c48-2de1-498a-893b-0fa1b8dbd0dd/ovn-acl-logging/0.log" Apr 16 08:58:07.197332 ip-10-0-130-41 kubenswrapper[2579]: I0416 
08:58:07.197216 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-5dsf6_e931af40-20ab-4a49-9b92-41dc1be13602/console-operator/1.log" Apr 16 08:58:07.199840 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:58:07.199824 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-5dsf6_e931af40-20ab-4a49-9b92-41dc1be13602/console-operator/1.log" Apr 16 08:58:07.202463 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:58:07.202446 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-m577d_09a74c48-2de1-498a-893b-0fa1b8dbd0dd/ovn-acl-logging/0.log" Apr 16 08:58:07.204971 ip-10-0-130-41 kubenswrapper[2579]: I0416 08:58:07.204957 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-m577d_09a74c48-2de1-498a-893b-0fa1b8dbd0dd/ovn-acl-logging/0.log" Apr 16 09:03:07.219285 ip-10-0-130-41 kubenswrapper[2579]: I0416 09:03:07.219181 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-5dsf6_e931af40-20ab-4a49-9b92-41dc1be13602/console-operator/1.log" Apr 16 09:03:07.226258 ip-10-0-130-41 kubenswrapper[2579]: I0416 09:03:07.222157 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-5dsf6_e931af40-20ab-4a49-9b92-41dc1be13602/console-operator/1.log" Apr 16 09:03:07.226258 ip-10-0-130-41 kubenswrapper[2579]: I0416 09:03:07.224794 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-m577d_09a74c48-2de1-498a-893b-0fa1b8dbd0dd/ovn-acl-logging/0.log" Apr 16 09:03:07.227218 ip-10-0-130-41 kubenswrapper[2579]: I0416 09:03:07.227202 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-m577d_09a74c48-2de1-498a-893b-0fa1b8dbd0dd/ovn-acl-logging/0.log" Apr 
16 09:08:07.239252 ip-10-0-130-41 kubenswrapper[2579]: I0416 09:08:07.239154 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-5dsf6_e931af40-20ab-4a49-9b92-41dc1be13602/console-operator/1.log" Apr 16 09:08:07.244064 ip-10-0-130-41 kubenswrapper[2579]: I0416 09:08:07.244044 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-5dsf6_e931af40-20ab-4a49-9b92-41dc1be13602/console-operator/1.log" Apr 16 09:08:07.244678 ip-10-0-130-41 kubenswrapper[2579]: I0416 09:08:07.244657 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-m577d_09a74c48-2de1-498a-893b-0fa1b8dbd0dd/ovn-acl-logging/0.log" Apr 16 09:08:07.248969 ip-10-0-130-41 kubenswrapper[2579]: I0416 09:08:07.248953 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-m577d_09a74c48-2de1-498a-893b-0fa1b8dbd0dd/ovn-acl-logging/0.log" Apr 16 09:13:07.259831 ip-10-0-130-41 kubenswrapper[2579]: I0416 09:13:07.259713 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-5dsf6_e931af40-20ab-4a49-9b92-41dc1be13602/console-operator/1.log" Apr 16 09:13:07.265184 ip-10-0-130-41 kubenswrapper[2579]: I0416 09:13:07.265160 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-m577d_09a74c48-2de1-498a-893b-0fa1b8dbd0dd/ovn-acl-logging/0.log" Apr 16 09:13:07.265336 ip-10-0-130-41 kubenswrapper[2579]: I0416 09:13:07.265265 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-5dsf6_e931af40-20ab-4a49-9b92-41dc1be13602/console-operator/1.log" Apr 16 09:13:07.270719 ip-10-0-130-41 kubenswrapper[2579]: I0416 09:13:07.270704 2579 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-m577d_09a74c48-2de1-498a-893b-0fa1b8dbd0dd/ovn-acl-logging/0.log" Apr 16 09:13:34.932201 ip-10-0-130-41 kubenswrapper[2579]: I0416 09:13:34.932099 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-ktjq7_fca15501-2323-4079-b35a-57d66f40d0b1/global-pull-secret-syncer/0.log" Apr 16 09:13:34.987418 ip-10-0-130-41 kubenswrapper[2579]: I0416 09:13:34.987383 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-j7rrw_3ab93a09-2404-44a9-8381-13976f5a1595/konnectivity-agent/0.log" Apr 16 09:13:35.113113 ip-10-0-130-41 kubenswrapper[2579]: I0416 09:13:35.113081 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-130-41.ec2.internal_2c43aaa946be06bbbb13a479a93166e2/haproxy/0.log" Apr 16 09:13:39.113325 ip-10-0-130-41 kubenswrapper[2579]: I0416 09:13:39.113301 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-ffnjx_8634d4e3-1dce-4678-b6d6-0e3e4f8118a1/node-exporter/0.log" Apr 16 09:13:39.135575 ip-10-0-130-41 kubenswrapper[2579]: I0416 09:13:39.135542 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-ffnjx_8634d4e3-1dce-4678-b6d6-0e3e4f8118a1/kube-rbac-proxy/0.log" Apr 16 09:13:39.159380 ip-10-0-130-41 kubenswrapper[2579]: I0416 09:13:39.159359 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-ffnjx_8634d4e3-1dce-4678-b6d6-0e3e4f8118a1/init-textfile/0.log" Apr 16 09:13:39.805694 ip-10-0-130-41 kubenswrapper[2579]: I0416 09:13:39.805668 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-9cb97cd87-wh5jv_a3aebdf4-6bc4-473a-8c68-006e1dc30678/prometheus-operator-admission-webhook/0.log" Apr 16 09:13:39.928554 ip-10-0-130-41 kubenswrapper[2579]: I0416 09:13:39.928530 
2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-588b49b58c-x8xxw_564b4b71-9646-4d83-bfa1-4eb02f22473a/thanos-query/0.log" Apr 16 09:13:39.953246 ip-10-0-130-41 kubenswrapper[2579]: I0416 09:13:39.953173 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-588b49b58c-x8xxw_564b4b71-9646-4d83-bfa1-4eb02f22473a/kube-rbac-proxy-web/0.log" Apr 16 09:13:39.982476 ip-10-0-130-41 kubenswrapper[2579]: I0416 09:13:39.982444 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-588b49b58c-x8xxw_564b4b71-9646-4d83-bfa1-4eb02f22473a/kube-rbac-proxy/0.log" Apr 16 09:13:40.009957 ip-10-0-130-41 kubenswrapper[2579]: I0416 09:13:40.009934 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-588b49b58c-x8xxw_564b4b71-9646-4d83-bfa1-4eb02f22473a/prom-label-proxy/0.log" Apr 16 09:13:40.032751 ip-10-0-130-41 kubenswrapper[2579]: I0416 09:13:40.032717 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-588b49b58c-x8xxw_564b4b71-9646-4d83-bfa1-4eb02f22473a/kube-rbac-proxy-rules/0.log" Apr 16 09:13:40.055685 ip-10-0-130-41 kubenswrapper[2579]: I0416 09:13:40.055663 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-588b49b58c-x8xxw_564b4b71-9646-4d83-bfa1-4eb02f22473a/kube-rbac-proxy-metrics/0.log" Apr 16 09:13:41.472677 ip-10-0-130-41 kubenswrapper[2579]: I0416 09:13:41.472646 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-5dsf6_e931af40-20ab-4a49-9b92-41dc1be13602/console-operator/1.log" Apr 16 09:13:41.481989 ip-10-0-130-41 kubenswrapper[2579]: I0416 09:13:41.481949 2579 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-5dsf6_e931af40-20ab-4a49-9b92-41dc1be13602/console-operator/2.log" Apr 16 09:13:42.369502 ip-10-0-130-41 kubenswrapper[2579]: I0416 09:13:42.369465 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-6sqkd/perf-node-gather-daemonset-bvbml"] Apr 16 09:13:42.372782 ip-10-0-130-41 kubenswrapper[2579]: I0416 09:13:42.372762 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6sqkd/perf-node-gather-daemonset-bvbml" Apr 16 09:13:42.375025 ip-10-0-130-41 kubenswrapper[2579]: I0416 09:13:42.375004 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-6sqkd\"/\"default-dockercfg-kg5b9\"" Apr 16 09:13:42.375197 ip-10-0-130-41 kubenswrapper[2579]: I0416 09:13:42.375178 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-6sqkd\"/\"openshift-service-ca.crt\"" Apr 16 09:13:42.376035 ip-10-0-130-41 kubenswrapper[2579]: I0416 09:13:42.376020 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-6sqkd\"/\"kube-root-ca.crt\"" Apr 16 09:13:42.381847 ip-10-0-130-41 kubenswrapper[2579]: I0416 09:13:42.381826 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-6sqkd/perf-node-gather-daemonset-bvbml"] Apr 16 09:13:42.449879 ip-10-0-130-41 kubenswrapper[2579]: I0416 09:13:42.449850 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/da4b4557-952c-4a7a-9da8-5611fb9b8b17-podres\") pod \"perf-node-gather-daemonset-bvbml\" (UID: \"da4b4557-952c-4a7a-9da8-5611fb9b8b17\") " pod="openshift-must-gather-6sqkd/perf-node-gather-daemonset-bvbml" Apr 16 09:13:42.450008 ip-10-0-130-41 kubenswrapper[2579]: I0416 09:13:42.449917 2579 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-km5n9\" (UniqueName: \"kubernetes.io/projected/da4b4557-952c-4a7a-9da8-5611fb9b8b17-kube-api-access-km5n9\") pod \"perf-node-gather-daemonset-bvbml\" (UID: \"da4b4557-952c-4a7a-9da8-5611fb9b8b17\") " pod="openshift-must-gather-6sqkd/perf-node-gather-daemonset-bvbml" Apr 16 09:13:42.450008 ip-10-0-130-41 kubenswrapper[2579]: I0416 09:13:42.449965 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/da4b4557-952c-4a7a-9da8-5611fb9b8b17-lib-modules\") pod \"perf-node-gather-daemonset-bvbml\" (UID: \"da4b4557-952c-4a7a-9da8-5611fb9b8b17\") " pod="openshift-must-gather-6sqkd/perf-node-gather-daemonset-bvbml" Apr 16 09:13:42.450008 ip-10-0-130-41 kubenswrapper[2579]: I0416 09:13:42.449987 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/da4b4557-952c-4a7a-9da8-5611fb9b8b17-proc\") pod \"perf-node-gather-daemonset-bvbml\" (UID: \"da4b4557-952c-4a7a-9da8-5611fb9b8b17\") " pod="openshift-must-gather-6sqkd/perf-node-gather-daemonset-bvbml" Apr 16 09:13:42.450112 ip-10-0-130-41 kubenswrapper[2579]: I0416 09:13:42.450011 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/da4b4557-952c-4a7a-9da8-5611fb9b8b17-sys\") pod \"perf-node-gather-daemonset-bvbml\" (UID: \"da4b4557-952c-4a7a-9da8-5611fb9b8b17\") " pod="openshift-must-gather-6sqkd/perf-node-gather-daemonset-bvbml" Apr 16 09:13:42.550532 ip-10-0-130-41 kubenswrapper[2579]: I0416 09:13:42.550504 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/da4b4557-952c-4a7a-9da8-5611fb9b8b17-lib-modules\") pod \"perf-node-gather-daemonset-bvbml\" (UID: 
\"da4b4557-952c-4a7a-9da8-5611fb9b8b17\") " pod="openshift-must-gather-6sqkd/perf-node-gather-daemonset-bvbml" Apr 16 09:13:42.550856 ip-10-0-130-41 kubenswrapper[2579]: I0416 09:13:42.550540 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/da4b4557-952c-4a7a-9da8-5611fb9b8b17-proc\") pod \"perf-node-gather-daemonset-bvbml\" (UID: \"da4b4557-952c-4a7a-9da8-5611fb9b8b17\") " pod="openshift-must-gather-6sqkd/perf-node-gather-daemonset-bvbml" Apr 16 09:13:42.550856 ip-10-0-130-41 kubenswrapper[2579]: I0416 09:13:42.550562 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/da4b4557-952c-4a7a-9da8-5611fb9b8b17-sys\") pod \"perf-node-gather-daemonset-bvbml\" (UID: \"da4b4557-952c-4a7a-9da8-5611fb9b8b17\") " pod="openshift-must-gather-6sqkd/perf-node-gather-daemonset-bvbml" Apr 16 09:13:42.550856 ip-10-0-130-41 kubenswrapper[2579]: I0416 09:13:42.550617 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/da4b4557-952c-4a7a-9da8-5611fb9b8b17-podres\") pod \"perf-node-gather-daemonset-bvbml\" (UID: \"da4b4557-952c-4a7a-9da8-5611fb9b8b17\") " pod="openshift-must-gather-6sqkd/perf-node-gather-daemonset-bvbml" Apr 16 09:13:42.550856 ip-10-0-130-41 kubenswrapper[2579]: I0416 09:13:42.550644 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/da4b4557-952c-4a7a-9da8-5611fb9b8b17-proc\") pod \"perf-node-gather-daemonset-bvbml\" (UID: \"da4b4557-952c-4a7a-9da8-5611fb9b8b17\") " pod="openshift-must-gather-6sqkd/perf-node-gather-daemonset-bvbml" Apr 16 09:13:42.550856 ip-10-0-130-41 kubenswrapper[2579]: I0416 09:13:42.550665 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-km5n9\" (UniqueName: 
\"kubernetes.io/projected/da4b4557-952c-4a7a-9da8-5611fb9b8b17-kube-api-access-km5n9\") pod \"perf-node-gather-daemonset-bvbml\" (UID: \"da4b4557-952c-4a7a-9da8-5611fb9b8b17\") " pod="openshift-must-gather-6sqkd/perf-node-gather-daemonset-bvbml" Apr 16 09:13:42.550856 ip-10-0-130-41 kubenswrapper[2579]: I0416 09:13:42.550678 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/da4b4557-952c-4a7a-9da8-5611fb9b8b17-lib-modules\") pod \"perf-node-gather-daemonset-bvbml\" (UID: \"da4b4557-952c-4a7a-9da8-5611fb9b8b17\") " pod="openshift-must-gather-6sqkd/perf-node-gather-daemonset-bvbml" Apr 16 09:13:42.550856 ip-10-0-130-41 kubenswrapper[2579]: I0416 09:13:42.550715 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/da4b4557-952c-4a7a-9da8-5611fb9b8b17-sys\") pod \"perf-node-gather-daemonset-bvbml\" (UID: \"da4b4557-952c-4a7a-9da8-5611fb9b8b17\") " pod="openshift-must-gather-6sqkd/perf-node-gather-daemonset-bvbml" Apr 16 09:13:42.550856 ip-10-0-130-41 kubenswrapper[2579]: I0416 09:13:42.550747 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/da4b4557-952c-4a7a-9da8-5611fb9b8b17-podres\") pod \"perf-node-gather-daemonset-bvbml\" (UID: \"da4b4557-952c-4a7a-9da8-5611fb9b8b17\") " pod="openshift-must-gather-6sqkd/perf-node-gather-daemonset-bvbml" Apr 16 09:13:42.559136 ip-10-0-130-41 kubenswrapper[2579]: I0416 09:13:42.559116 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-km5n9\" (UniqueName: \"kubernetes.io/projected/da4b4557-952c-4a7a-9da8-5611fb9b8b17-kube-api-access-km5n9\") pod \"perf-node-gather-daemonset-bvbml\" (UID: \"da4b4557-952c-4a7a-9da8-5611fb9b8b17\") " pod="openshift-must-gather-6sqkd/perf-node-gather-daemonset-bvbml" Apr 16 09:13:42.682783 ip-10-0-130-41 kubenswrapper[2579]: I0416 09:13:42.682717 
2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6sqkd/perf-node-gather-daemonset-bvbml" Apr 16 09:13:42.801692 ip-10-0-130-41 kubenswrapper[2579]: I0416 09:13:42.801669 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-6sqkd/perf-node-gather-daemonset-bvbml"] Apr 16 09:13:42.804218 ip-10-0-130-41 kubenswrapper[2579]: W0416 09:13:42.804182 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podda4b4557_952c_4a7a_9da8_5611fb9b8b17.slice/crio-f359bef8d1018b54dd38894c485805fe0301a2d0c8b3b290580c368afa0501c4 WatchSource:0}: Error finding container f359bef8d1018b54dd38894c485805fe0301a2d0c8b3b290580c368afa0501c4: Status 404 returned error can't find the container with id f359bef8d1018b54dd38894c485805fe0301a2d0c8b3b290580c368afa0501c4 Apr 16 09:13:42.805742 ip-10-0-130-41 kubenswrapper[2579]: I0416 09:13:42.805726 2579 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 09:13:43.078750 ip-10-0-130-41 kubenswrapper[2579]: I0416 09:13:43.078720 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-flzpm_7025f860-4946-43aa-9ebe-7d45f5616858/dns/0.log" Apr 16 09:13:43.101652 ip-10-0-130-41 kubenswrapper[2579]: I0416 09:13:43.101634 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-flzpm_7025f860-4946-43aa-9ebe-7d45f5616858/kube-rbac-proxy/0.log" Apr 16 09:13:43.223243 ip-10-0-130-41 kubenswrapper[2579]: I0416 09:13:43.223218 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-6fwv8_9ab7948f-df7f-4fae-ad8e-a2cab21c427e/dns-node-resolver/0.log" Apr 16 09:13:43.553312 ip-10-0-130-41 kubenswrapper[2579]: I0416 09:13:43.553224 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6sqkd/perf-node-gather-daemonset-bvbml" 
event={"ID":"da4b4557-952c-4a7a-9da8-5611fb9b8b17","Type":"ContainerStarted","Data":"e7a4f87e242f47a645746b9f02434cc967ab5e773b81a9bcc4aef8cc0d1b5f44"} Apr 16 09:13:43.553312 ip-10-0-130-41 kubenswrapper[2579]: I0416 09:13:43.553261 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6sqkd/perf-node-gather-daemonset-bvbml" event={"ID":"da4b4557-952c-4a7a-9da8-5611fb9b8b17","Type":"ContainerStarted","Data":"f359bef8d1018b54dd38894c485805fe0301a2d0c8b3b290580c368afa0501c4"} Apr 16 09:13:43.553697 ip-10-0-130-41 kubenswrapper[2579]: I0416 09:13:43.553355 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-6sqkd/perf-node-gather-daemonset-bvbml" Apr 16 09:13:43.569952 ip-10-0-130-41 kubenswrapper[2579]: I0416 09:13:43.569912 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-6sqkd/perf-node-gather-daemonset-bvbml" podStartSLOduration=1.569884224 podStartE2EDuration="1.569884224s" podCreationTimestamp="2026-04-16 09:13:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 09:13:43.568777397 +0000 UTC m=+2436.975816363" watchObservedRunningTime="2026-04-16 09:13:43.569884224 +0000 UTC m=+2436.976923186" Apr 16 09:13:43.731799 ip-10-0-130-41 kubenswrapper[2579]: I0416 09:13:43.731767 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-94967b795-42p2c_db8c63e6-b841-42a4-8465-5a51f82d5a40/registry/0.log" Apr 16 09:13:43.779929 ip-10-0-130-41 kubenswrapper[2579]: I0416 09:13:43.779882 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-hsx9c_2dd4e090-3d55-499c-a3fb-7a04e930e31c/node-ca/0.log" Apr 16 09:13:44.527949 ip-10-0-130-41 kubenswrapper[2579]: I0416 09:13:44.527919 2579 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ingress_router-default-68d648d8c6-gdh8b_108ff4c3-fb90-4321-9fcb-90d3aeeab5bf/router/0.log" Apr 16 09:13:44.865454 ip-10-0-130-41 kubenswrapper[2579]: I0416 09:13:44.865429 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-b9rrh_06b0854a-ae72-4622-b5bd-6803c4a6d119/serve-healthcheck-canary/0.log" Apr 16 09:13:45.376697 ip-10-0-130-41 kubenswrapper[2579]: I0416 09:13:45.376663 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-7cdb7_b9604c68-8fac-4d6c-bd52-7623732bd202/kube-rbac-proxy/0.log" Apr 16 09:13:45.399181 ip-10-0-130-41 kubenswrapper[2579]: I0416 09:13:45.399156 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-7cdb7_b9604c68-8fac-4d6c-bd52-7623732bd202/exporter/0.log" Apr 16 09:13:45.421619 ip-10-0-130-41 kubenswrapper[2579]: I0416 09:13:45.421590 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-7cdb7_b9604c68-8fac-4d6c-bd52-7623732bd202/extractor/0.log" Apr 16 09:13:47.267405 ip-10-0-130-41 kubenswrapper[2579]: I0416 09:13:47.267372 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-jobset-operator_jobset-controller-manager-7b86fbfbb-tmxnt_98b63693-8bf4-4549-9763-81507572febd/manager/0.log" Apr 16 09:13:49.567558 ip-10-0-130-41 kubenswrapper[2579]: I0416 09:13:49.567529 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-6sqkd/perf-node-gather-daemonset-bvbml" Apr 16 09:13:50.796655 ip-10-0-130-41 kubenswrapper[2579]: I0416 09:13:50.796522 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-64d4d94569-fs7cw_88ff5b6d-15ea-4646-8401-4e76a7424f8e/migrator/0.log" Apr 16 09:13:50.820656 ip-10-0-130-41 kubenswrapper[2579]: I0416 09:13:50.820629 2579 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-64d4d94569-fs7cw_88ff5b6d-15ea-4646-8401-4e76a7424f8e/graceful-termination/0.log" Apr 16 09:13:52.308404 ip-10-0-130-41 kubenswrapper[2579]: I0416 09:13:52.308378 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6cdbx_7330bfa4-019f-4fdd-bc61-5919a528f3e1/kube-multus-additional-cni-plugins/0.log" Apr 16 09:13:52.334069 ip-10-0-130-41 kubenswrapper[2579]: I0416 09:13:52.334050 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6cdbx_7330bfa4-019f-4fdd-bc61-5919a528f3e1/egress-router-binary-copy/0.log" Apr 16 09:13:52.358722 ip-10-0-130-41 kubenswrapper[2579]: I0416 09:13:52.358700 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6cdbx_7330bfa4-019f-4fdd-bc61-5919a528f3e1/cni-plugins/0.log" Apr 16 09:13:52.379742 ip-10-0-130-41 kubenswrapper[2579]: I0416 09:13:52.379686 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6cdbx_7330bfa4-019f-4fdd-bc61-5919a528f3e1/bond-cni-plugin/0.log" Apr 16 09:13:52.402709 ip-10-0-130-41 kubenswrapper[2579]: I0416 09:13:52.402676 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6cdbx_7330bfa4-019f-4fdd-bc61-5919a528f3e1/routeoverride-cni/0.log" Apr 16 09:13:52.439941 ip-10-0-130-41 kubenswrapper[2579]: I0416 09:13:52.439920 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6cdbx_7330bfa4-019f-4fdd-bc61-5919a528f3e1/whereabouts-cni-bincopy/0.log" Apr 16 09:13:52.482678 ip-10-0-130-41 kubenswrapper[2579]: I0416 09:13:52.482659 2579 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6cdbx_7330bfa4-019f-4fdd-bc61-5919a528f3e1/whereabouts-cni/0.log" Apr 16 09:13:52.874974 ip-10-0-130-41 kubenswrapper[2579]: I0416 09:13:52.874947 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mkvns_ed4414d4-f963-4c14-87d5-738798aeb287/kube-multus/0.log" Apr 16 09:13:53.035292 ip-10-0-130-41 kubenswrapper[2579]: I0416 09:13:53.035264 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-vdxz4_53e35ca8-ec77-48b7-8e96-ae73f7083c85/network-metrics-daemon/0.log" Apr 16 09:13:53.054523 ip-10-0-130-41 kubenswrapper[2579]: I0416 09:13:53.054454 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-vdxz4_53e35ca8-ec77-48b7-8e96-ae73f7083c85/kube-rbac-proxy/0.log" Apr 16 09:13:54.465362 ip-10-0-130-41 kubenswrapper[2579]: I0416 09:13:54.465302 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-m577d_09a74c48-2de1-498a-893b-0fa1b8dbd0dd/ovn-controller/0.log" Apr 16 09:13:54.483170 ip-10-0-130-41 kubenswrapper[2579]: I0416 09:13:54.483144 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-m577d_09a74c48-2de1-498a-893b-0fa1b8dbd0dd/ovn-acl-logging/0.log" Apr 16 09:13:54.505558 ip-10-0-130-41 kubenswrapper[2579]: I0416 09:13:54.505533 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-m577d_09a74c48-2de1-498a-893b-0fa1b8dbd0dd/ovn-acl-logging/1.log" Apr 16 09:13:54.527229 ip-10-0-130-41 kubenswrapper[2579]: I0416 09:13:54.527207 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-m577d_09a74c48-2de1-498a-893b-0fa1b8dbd0dd/kube-rbac-proxy-node/0.log" Apr 16 09:13:54.548906 ip-10-0-130-41 kubenswrapper[2579]: I0416 09:13:54.548865 2579 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-m577d_09a74c48-2de1-498a-893b-0fa1b8dbd0dd/kube-rbac-proxy-ovn-metrics/0.log" Apr 16 09:13:54.569194 ip-10-0-130-41 kubenswrapper[2579]: I0416 09:13:54.569175 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-m577d_09a74c48-2de1-498a-893b-0fa1b8dbd0dd/northd/0.log" Apr 16 09:13:54.589577 ip-10-0-130-41 kubenswrapper[2579]: I0416 09:13:54.589557 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-m577d_09a74c48-2de1-498a-893b-0fa1b8dbd0dd/nbdb/0.log" Apr 16 09:13:54.613454 ip-10-0-130-41 kubenswrapper[2579]: I0416 09:13:54.613432 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-m577d_09a74c48-2de1-498a-893b-0fa1b8dbd0dd/sbdb/0.log" Apr 16 09:13:54.772113 ip-10-0-130-41 kubenswrapper[2579]: I0416 09:13:54.772047 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-m577d_09a74c48-2de1-498a-893b-0fa1b8dbd0dd/ovnkube-controller/0.log" Apr 16 09:13:55.852254 ip-10-0-130-41 kubenswrapper[2579]: I0416 09:13:55.852231 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-ntc92_3b6d17c9-2d51-47bc-9e36-95fb034872cb/network-check-target-container/0.log" Apr 16 09:13:56.812263 ip-10-0-130-41 kubenswrapper[2579]: I0416 09:13:56.812238 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-fpxkh_da769e7e-5234-4962-b0d5-107292fb0b9f/iptables-alerter/0.log" Apr 16 09:13:57.528614 ip-10-0-130-41 kubenswrapper[2579]: I0416 09:13:57.528584 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-t6g8t_bc5726ae-d0b1-473a-9fd1-b1085d2b108e/tuned/0.log"